Global Hurricane Hours

Results Preview

Climate alarmists claim that hurricane frequency is increasing. I have already dealt with this for the Atlantic, here. There was no trend. Today, I will analyze global data.

NOAA sponsors the largest collection of historic hurricane data: IBTrACS (International Best Track Archive for Climate Stewardship). There are 13,545 storms archived, going back to 1842, from a total of 14 agencies. My focus will be only on hurricanes, that is, storms that at some point achieved wind speeds at or above 64 knots, also known as Category 1 or greater.

It is definitely true that the number of detected hurricanes has increased. This is due to better sensing technology: radiosondes (1930s), regular transoceanic air travel (1940s), and satellites (late 1960s). Due to the limitations of older data, it doesn’t make sense to consider global data before 1950.

Aside from detection, there is also a matter of how one counts the frequency of hurricanes. Does it make sense to count a 6-hour Category 3 storm the same as a 42-hour Category 3 storm?

Both are classified as Category 3, but the time spent at Cat 3 strength is unequal.

No, it doesn’t make sense. Such a thing would be misleading. But that is what climate alarmists do.

A better thing to do would be to count the hours spent in each wind speed category, as sketched below:
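Here is the idea as a minimal Python sketch (an illustration only; the actual pipeline at the end of this post is bash/awk, counts each qualifying fix as 3 hours to match IBTrACS spacing, and uses these same Saffir-Simpson knot ranges in hurrs()):

# Illustrative sketch: hours spent in each Saffir-Simpson category,
# from 3-hourly wind fixes given in knots.
CATS = {1: (64, 82), 2: (83, 95), 3: (96, 112), 4: (113, 136), 5: (137, 999)}

def category_hours(fixes_kt, hours_per_fix=3):
    hours = {c: 0 for c in CATS}
    for w in fixes_kt:
        for c, (lo, hi) in CATS.items():
            if lo <= w <= hi:
                hours[c] += hours_per_fix
    return hours

# A 42-hour Cat 3 (14 fixes at 100 kt) vs a 6-hour Cat 3 (2 fixes):
print(category_hours([100] * 14)[3], category_hours([100] * 2)[3])  # 42 6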

And this is exactly what I did. Here are the results:

10yr CMA means 10-Year Centered Moving Average.

What does the best hurricane data in the world show?

Category 5 has decreased

Category 4 is cyclic/no-trend

Category 3 had increased, but has been dropping for the last 25 years

Category 2 had increased, but has been dropping for the last 25 years

Category 1 is cyclic/no-trend

Now we move on to category hybrids:

Category 1 & 2
Categories 3,4 & 5

Yeah, pretty much cyclic/no trend.

Category 1,2,3,4,5 is the first image in this post. No trend!

Here are stats for top 10 years for each of our categories:

Cat 1 Hours
1972 :  5187
1992 :  4533
1997 :  4230
1996 :  4149
1990 :  4059
1971 :  4047
1964 :  3924
1968 :  3711
1989 :  3564
1986 :  3402
Cat 2 Hours
1992 :  2670
1997 :  2289
2015 :  2184
2004 :  2082
1972 :  2022
2005 :  1863
1994 :  1833
1996 :  1818
2018 :  1800
1991 :  1743
Cat 3 Hours
2015 :  1752
1997 :  1692
2018 :  1686
1992 :  1650
2004 :  1569
2005 :  1434
1972 :  1416
2019 :  1359
1994 :  1344
1991 :  1311
Cat 4 Hours
2004 :  1194
2015 :  1122
2018 :  1086
1997 :  1032
1992 :  1014
2005 :   978
1996 :   966
1994 :   909
2003 :   894
2014 :   885
Cat 5 Hours
1959 :   528
1997 :   462
1958 :   408
1957 :   330
2018 :   324
1961 :   312
1965 :   291
1954 :   267
2004 :   234
1962 :   228
Cat 1+2 Hours
1972 :  6201
1992 :  5739
1997 :  5412
1996 :  5031
1990 :  4842
1971 :  4767
1964 :  4599
2015 :  4542
1968 :  4359
1994 :  4320
Cat 3+4+5 Hours
1997 :  2466
2004 :  2379
1992 :  2295
2015 :  2268
2018 :  2187
2005 :  2007
1994 :  1866
1961 :  1842
1996 :  1797
2003 :  1710
Cat 1+2+3+4+5 Hours
1972 :  6606
1992 :  6489
1997 :  6123
2015 :  5799
1990 :  5613
1996 :  5544
1971 :  5424
1964 :  5373
2018 :  5346
2004 :  5307

Climate alarmists claim that greenhouse gases create more energy for hurricanes. Well, where is that extra energy?

Take care, -Zoe

Update

I added more categories.

Storm hours over 20 knots (23mph, 37 km/h)

There is a decrease in hours spent above 20 knots in the satellite era.

Storm hours over 5 knots (5.8 mph, 9.3 km/h)

Also a decrease in hours spent above 5 knots. This is not even a breeze.

So, I ask again: where is the extra storm energy from carbon dioxide?

Code

# Zoe Phin, 2021/06/16
# File: ibtracs.sh
# Run: source ibtracs.sh; require; download; hurrs; stats; plotall

require() { sudo apt-get install nco gnuplot; }

download() { wget -c "https://www.ncei.noaa.gov/data/international-best-track-archive-for-climate-stewardship-ibtracs/v04r00/access/netcdf/IBTrACS.ALL.v04r00.nc"; }

view="ncks IBTrACS.ALL.v04r00.nc --trd -HC -v" 

filter() {	# Speed Filter: $1 - Min, $2 - Max
    tr -d ' ' | tr _ 0 | awk -vm=$1 -vM=$2 -F '[]=[]' '$8>=m && $8<=M {print $2"."$4}'
}

winds() { # All data sources
    $view bom_wind        | filter $1 $2 > .w01
    $view cma_wind        | filter $1 $2 > .w02
    $view ds824_wind      | filter $1 $2 > .w03
    $view hko_wind        | filter $1 $2 > .w04
    $view mlc_wind        | filter $1 $2 > .w05
    $view nadi_wind       | filter $1 $2 > .w06
    $view neumann_wind    | filter $1 $2 > .w07
    $view newdelhi_wind   | filter $1 $2 > .w08
    $view reunion_wind    | filter $1 $2 > .w09
    $view td9635_wind     | filter $1 $2 > .w10
    $view tokyo_wind      | filter $1 $2 > .w11
    $view usa_wind        | filter $1 $2 > .w12
    $view wellington_wind | filter $1 $2 > .w13
    $view wmo_wind		  | filter $1 $2 > .w14
}

yhours() { # count unique 3-hourly fixes in wind range [$1,$2], convert to hours (x3)
    winds $1 $2
    sort -nu .w* | awk -F. '{print $1}' | uniq -c | awk '{print $2" "$1}' > storm.obs
    sed -f storm.year storm.obs | awk '{Y[$1]+=$2} END { for (y in Y) print y" "Y[y]*3}'
}

hurrs() {
    $view season | tr -d ' ' | awk -F '[]=[]' '{print "s/^"$4" / "$6" /"}' | sed \$d > storm.year
    # Saffir-Simpson wind ranges, in knots:
    yhours 137 999 > cat5.hours
    yhours 113 136 > cat4.hours
    yhours  96 112 > cat3.hours
    yhours  83  95 > cat2.hours
    yhours  64  82 > cat1.hours

    yhours  64  95 > cat12.hours
    yhours  96 999 > cat345.hours
    yhours  64 999 > cat12345.hours
}

stats() {
    for c in 1 2 3 4 5 12 345 12345 ; do
        f="cat$c.hours"
        sort -rnk2.1 $f | awk -vc=$c 'BEGIN{ 
            print "Cat "c" Hours" } 
            NR<11 { printf "%s : %5s\n",$1,$2 }' 
    done
}

cma() { # Centered moving average over a window of p+1 points ($1 = p)
    cut -c6- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""          # pad output to align with input years
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++) s+=$j/(p+1)
        printf "%.4f\n", s
    }}'
}

plot() {
    awk '$1>=1950{print $0}' $1 | tee .dat | cma 10 > .cma
    paste -d ' ' .dat .cma > .plt
    c=$(echo -n $1 | tr -cd 12345)
    echo "set term png size 740,480; set mytics 5
    set key outside top left horizontal
    set grid; set xrange [1950:2020]
    plot '.plt' u 1:2 t 'Category $c Hours' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > c$c.png
}

plotall() { for c in 1 2 3 4 5 12 345 12345 ; do plot "cat$c.hours"; done; }

On Albedo

Albedo is the fraction of incoming solar radiation that is reflected back to space. It’s extremely vital in climate science, and so it’s important to know what it actually is, and what its trend has been.
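As a formula, it is just reflected shortwave divided by incoming solar; the plot() code at the end of this post computes it exactly that way from CERES global means. A trivial Python sketch (the two inputs are illustrative magnitudes, not data):

# Albedo = reflected shortwave / incoming solar.
# The numbers below are only illustrative of the right magnitudes.
def albedo(reflected_sw_wm2, incoming_solar_wm2):
    return reflected_sw_wm2 / incoming_solar_wm2

print(round(albedo(99.6, 340.2), 4))  # ~0.2928, close to the CERES-era values shown below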

I was taught in school that albedo is 0.3. NASA’s Earth Factsheet lists it as 0.306. I have always used 0.3 for quick calculations. Most climate science guides on the internet also use this value.

The history of albedo-finding is shown in this paper:

History of Albedo

The history generally shows a reduction in albedo. The real question is: is albedo actually decreasing, or are our measurement techniques merely getting better?

This matters a great deal, because:

A drop of as little as 0.01 in Earth’s albedo would have a major warming influence on climate—roughly equal to the effect of doubling the amount of carbon dioxide in the atmosphere, which would cause Earth to retain an additional 3.4 watts of energy for every square meter of surface area.

NASA

The 3.4 W/m² figure follows from mean top-of-atmosphere insolation of roughly 340 W/m²: a 0.01 drop in albedo means about 340 * 0.01 = 3.4 W/m² of extra retained sunlight. The radiative forcing formula to make the above quote true would have to be:

Forcing = 4.906 * ln(new_co2 / old_co2)   { W/m² }

Because 4.906*ln(2) = 3.4. The IPCC uses the value 5.35 rather than 4.906, but I have to go with NASA here.

co2levels.org reminds us that CO2 concentration has increased from ~370ppm in 2000 to ~420ppm today. The theoretical forcing since 2000 would be:

4.906*ln(420/370) = 0.622 W/m²

Back to albedo …

The best available albedo data from 2000 to 2021 is available from CERES ( “the only project worldwide whose prime objective is to produce global climate data records of Earth’s Radiation Budget” ). You can download it here, after you register here. Here is what the data shows:

Albedo Change 2000 - Now:
0.2929 - 0.2891 = 0.0038

The theoretical albedo forcing would thus be

3.4 * 0.0038/0.01 = 1.292 W/m²

Thus the albedo forcing is twice as high as the CO2 forcing!
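A quick Python check of the arithmetic above (all values as quoted in this post; an illustration, not new data):

# Sanity-check the forcing arithmetic.
from math import log

k = 4.906                     # W/m2 per ln(CO2 ratio), implied by NASA's 3.4 figure
print(round(k * log(2), 2))   # 3.4 W/m2 for a CO2 doubling
co2 = k * log(420 / 370)      # theoretical CO2 forcing since 2000
alb = 3.4 * 0.0038 / 0.01     # forcing from the CERES albedo drop since 2000
print(round(co2, 3), round(alb, 3), round(alb / co2, 2))  # 0.622 1.292 2.08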

It’s because it’s twice as high that the IPCC and other climate change advocacy groups do not use albedo at all. They refer to surface albedo, which favors slight cooling, but not to the atmospheric albedo, which would ruin their neat narrative.

IPCC, AR5 Figure 8.17

The atmospheric albedo forcing from 2000 to 2021 is ~77% of the entire theoretical CO2 forcing from 1750 to just before that report was issued in 2013.

Think about it. Take care. -Zoe

Code

# Zoe Phin, 2021/06/01
# File: alb.sh
# Run: . alb.sh; require; download; plot

require() { sudo apt-get install -y nco gmt gnuplot; }

download() { echo "	No automated download. Follow instructions:
    Register Account at: https://urs.earthdata.nasa.gov/
    Manually Download with a web browser:
    https://asdc.larc.nasa.gov/data/CERES/EBAF/TOA_Edition4.1/CERES_EBAF-TOA_Edition4.1_200003-202103.nc
    Then move file to this directory."
}

plot() {
    cmd="ncks CERES_EBAF-TOA_Edition4.1_200003-202103.nc"
    $cmd -v gtoa_sw_all_mon --trd -HC | sed \$d | awk -F= '{print $3}' > .swa
    $cmd -v gsolar_mon --trd -HC | sed \$d | awk -F= '{print $3}' > .sun
    # albedo = reflected shortwave / incoming solar, per month
    paste -d ' ' .swa .sun | awk '{ printf "%.3f %.4f\n", 
        2000+1/6+NR/12+1/24, $1/$2 }' | gmt gmtregress | sed 1d | tee .plt | sed -n '1p;$p' | awk '{
            printf "%.4f ", $3} END {print ""} ' | awk '{print $1" - "$2" = "$1-$2}'
    echo "set term png size 740,560
    set key out top center horizontal
    set xrange [1999.5:2021.5]
    set format y '%.03f'
    set mxtics 5; set mytics 5
    set grid 
    plot '.plt' u 1:2 t 'Albedo' w l lw 2 lc rgb '#0000EE',\
             '' u 1:3 t 'Linear Regression' w l lw 2 lc rgb '#000077'
    " | gnuplot > alb.png
}

Atlantic Hurricanes Trend

Climate alarmists claim that Atlantic hurricanes will increase in frequency and intensity due to emission of carbon dioxide. Is this true?

NOAA provides the data (HURDAT2) we need to examine this claim. Let’s first look at the frequency of hurricanes:

Hurricane Occurrences per Year

Their first claim has some evidence, but let’s give this some thought: is measuring the frequency really sensible? Wouldn’t it make more sense to measure the amount of time the Atlantic spends in hurricane mode? Yes, I think that is a better measure.

Hours of Hurricanes per Year

The number of hurricane hours per year shows absolutely no trend!

What about their second claim: Is intensity increasing?

We can figure out hurricane intensity using a hurricane’s lowest pressure as a proxy. The lower the pressure, the more intense the storm.

Here are all the hurricanes and their lowest pressure values:

Hurricane #’s Lowest Pressure

There is absolutely no trend in hurricane intensity in nearly 170 years!

Clearly, climate alarmists are wrong in regard to Atlantic hurricanes.

That’s all. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/05/20
# File: atlhur.sh
# Run: source atlhur.sh; require; download; freqplot; hoursplot; presplot
# Output: freq.png, hours.png, pres.png

require() { sudo apt-get install -y gnuplot; }

download() { wget -cO atl.csv "https://www.nhc.noaa.gov/data/hurdat/hurdat2-1851-2019-052520.txt"; }

cma() {
    cut -c6- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++) s+=$j/(p+1)
        printf "%.4f\n", s
    }}'
}

freqplot() {
    cat atl.csv | tr -d '\n' | sed 's/AL[0-9]/\nAL/g' | grep HU | awk '{
        print substr($1,4,4)}' | uniq -c | awk '{ 
            print $2" "$1 }' | tee freq.csv | cma 10 > freq.cma

    paste -d ' ' freq.csv freq.cma > freq.plt
    
    echo "set term png size 740,480; set mytics 2
    set key outside top center horizontal
    set grid; set xrange [1850:2020]
    plot 'freq.plt' u 1:2 t 'Hurricanes' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > freq.png
}

hoursplot() { # HURDAT2 records are 6-hourly, so each HU record = 6 hours
    awk -F, '$3=="  " && $4==" HU" { 
        print substr($1,1,4) }' atl.csv | uniq -c | awk '{ 
            print $2" "$1*6 }' | tee hours.csv | cma 10 > hours.cma

    paste -d ' ' hours.csv hours.cma > hours.plt
    
    echo "set term png size 740,480; set mytics 5
    set key outside top left horizontal
    set grid; set xrange [1850:2020]
    plot 'hours.plt' u 1:2 t 'Hurricane Hours' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > hours.png
}

presplot() {
    cat atl.csv | awk -F, 'NF==4 { print $1 } NF>4 && $3=="  " && $4==" HU" && $8!=" -999" { print $8
        }' | tr -d '\n' | sed 's/AL/\nAL/g' | awk 'NF>1{print}' | awk '{
            min=9999; for (i=2;i<=NF;i++) if ($i < min) min=$i; printf "%04d %d\n", NR, min
        }' | tee pres.csv | cma 50 > pres.cma

    paste -d ' ' pres.csv pres.cma > pres.plt

    echo "set term png size 740,480; set mytics 2
    set key outside top left horizontal
    set grid; set yrange [1005:880]
    set xrange [0:611]; set xlabel 'Hurricane #'
    set xtics nomirror; set x2tics ('1853' 1, '1900' 73, '1950' 182, '(Year)' 305, '2000' 466,'2019' 606)
    plot 'pres.plt' u 1:2 t 'Lowest Pressure' w l lt 3 lw 2,'' u 1:3 t '50 Hurricanes CMA' w l lt 6 lw 4
    " | gnuplot > pres.png
}

Land Drought Index Trend

The Standardised Precipitation-Evapotranspiration Index (SPEI) dataset gives us drought anomaly conditions over land spanning 1901 to 2018 (inclusive), monthly, on a 0.5 degree latitude/longitude grid. Today I combined all this grid data, area-weighted (see the sketch just before the code below), into a global land-only drought anomaly index and show its trend over time. Result:

SPE Index

Looks like it’s getting drier over land, but it also looks cyclical. Time will tell.

Enjoy 🙂 -Zoe

Chart data archived here.

Update

Northern Hemisphere
Southern Hemisphere
Tropics
Poles

Drying in NH. Tiny drying in SH. Drying in the tropics. Wetter at the poles.

Note: Tropics = abs(latitude) < 23.5, Poles = abs(latitude) > 66.5
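The code below combines grid cells with ellipsoidal area weights. Here is a minimal numpy sketch of that weighting (an illustration; the real implementation is the awk inside onetime()):

# Area of a 0.5-degree cell falls off roughly as cos(latitude),
# with a small WGS84 ellipsoidal correction.
import numpy as np

a, b = 6378.137, 6356.752     # WGS84 semi-major/semi-minor axes, km
e = 1 - b**2 / a**2
r = np.pi / 180.0             # degrees to radians

def cell_area(lat, res=0.5):
    return (a * res * r)**2 * (1 - e) * np.cos(r * lat) / (1 - e * np.sin(r * lat)**2)**2

lats = np.arange(-89.75, 90.0, 0.5)
weights = cell_area(lats)
# global index = sum(spei * weight) over non-missing cells / sum(weights)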

Code

# Zoe Phin, 2021/05/19
# File: spei.sh
# Run: source spei.sh; require; download; alltime; plot
# Output: spei.csv, spei.yoy, spei.png

require() { sudo apt-get install -y nco gmt; }

download() {
    wget -O spei.nc --no-check-certificate https://digital.csic.es/bitstream/10261/202305/2/spei01.nc
}

onetime() { # Area-weighted global land mean of one monthly time slice ($1)
    ncks -HC --trd -v spei spei.nc -d time,$1,$1 | sed \$d | awk -F '[= ]' '
    $8 != "_" { 
        # area of a 0.5-degree cell at latitude $4 on the WGS84 ellipsoid
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=(a/2*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=$8*A
    } END { print S/SA }'
}

alltime() {
    for t in {0..1415}; do
        awk -vt=$t 'BEGIN{printf "%6.2f ", 1901+t/12+1/24}'
        onetime $t
    done | tee spei.csv
}

annual() {
    cat spei.csv | sed \$d | awk '{
        Y[substr($1,1,4)] += $2/12
    } END {
        for (y in Y) printf "%4d %.4f\n", y, Y[y]
    }'
}

yoy() {
    cat spei.csv | cut -c9- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++)
            s+=$j/(p+1)
        printf "%.4f\n", s
    }}' > spei.yoy
}

plot() { 
    yoy 120; paste -d ' ' spei.csv spei.yoy > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal
        set ytics format '%4.2f'
        set mxtics 2; set mytics 5
        set xrange [1900:2020]
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Wetter' w filledcurve above y1=0 lw 2 lc rgb '#0000DD',\
                     '' u 1:2 t 'Dryer'  w filledcurve below y1=0 lw 2 lc rgb '#DD0000',\
                     '' u 1:3 t '10yr CMA' w lines lw 3 lc rgb 'black'		
    " | gnuplot > spei.png 
}

Warming near Coal Plants

I was learning some Python over the weekend, and then afterwards decided to go for a jog at my favorite spot. As I was jogging, I thought about finding out what the temperature trend at this location was by extracting and plotting it with Python. This site seems to offer the best local 4km data for the continental US.

My first Python program:

import requests as req
import numpy as np
import matplotlib.pyplot as mp

loc = {
	'spares': '4km', 'call': 'pp/yearly_timeseries', 'proc': 'gridserv', 'units': 'si',
	'stats': 'tmean', 'start': '1895', 'end': '2019', 'lon': '-84.3922', 'lat': '34.0000'
}

res = req.post('https://prism.oregonstate.edu/explorer/dataexplorer/rpc.php', data=loc)

tmean = res.json()['result']['data']['tmean']

y = np.array(tmean)
x = np.arange(1895,2020)
mp.plot(x,y)
m, b = np.polyfit(x, y, 1)  # least-squares linear fit: m is the trend in degrees C per year
mp.plot(x, m*x+b)

mp.tight_layout()
mp.savefig('loc.png')

Result:

Azalea Park, Roswell, GA

Well look at that: no warming where I like to jog. How about downtowns of some big cities?

OK. Definitely warming here.

Then I had a great idea. What if I chose spots with coal plants?

I got a list of the top 20 coal plants in the USA from here.

Results:

Bowden               -0.0076
Scherer              -0.0084
Gibson               -0.0098
Monroe               -0.0005
Amos                 -0.0020
Miller               -0.0071
Parish               +0.0044
Cumberland           -0.0059
Gavin                -0.0071
Rockport             -0.0030
Paradise             -0.0048
Roxboro              -0.0090
Sammis               -0.0005
Stuart               -0.0070
Navajo               +0.0116
Sherburne            +0.0032
Martin               -0.0048
Belews               -0.0045
Jeffrey              -0.0062
Gaston               -0.0057

Units are degrees Celsius per year, based on the slope of the linear regression (multiply by 100 for degrees per century).

Only 3 out of 20 coal plant locations experienced warming in the last century. 85% had cooling!

I suspect this fact will not cause climate alarmists to pause and think for one second.

What do skeptics think?

Enjoy 🙂 -Zoe

Corrections:

Above, Sammis image turned out to be a duplicate of Roxboro. Corrected.

Also, a typo: LogAngeles = Los Angeles

Code:

# Zoe Phin, 2021/05/09
# File: prism.py
# Setup: sudo apt-get install python3-numpy python3-matplotlib
# Run: python3 prism.py

import requests as req
import numpy as np
import matplotlib.pyplot as mp

locations = {
'Roswell':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34','lon':'-84.385'},
'NYC':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'40.7', 'lon':'-74'},
'LosAngeles':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34.0543', 'lon':'-118.2438'},
'WashingtonDC':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.8968','lon':'-77.0366'},
'Chicago':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'41.8758','lon':'-87.6191'},

'Bowden':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34.1271','lon':'-84.9155'},
'Scherer':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.0667','lon':'-84.8'},
'Gibson':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.3697','lon':'-87.7702'},
'Monroe':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'41.8881','lon':'-83.3453'},
'Amos':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.4746','lon':'-81.8233'},

'Miller':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.6327','lon':'-87.0595'},
'Parish':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'29.4797','lon':'-95.6320'},
'Cumberland':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.3906','lon':'-87.6537'},
'Gavin':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.9366','lon':'-82.1162'},
'Rockport':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'37.9268','lon':'-87.0354'},

'Paradise':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'37.2585','lon':'-86.9799'},
'Roxboro':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.4840','lon':'-79.0725'},
'Sammis':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'40.5686','lon':'-80.4311'},
'Stuart':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.6375','lon':'-83.6921'},
'Navajo':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.9054','lon':'-111.3877'},

'Sherburne':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'45.3795','lon':'-93.8960'},
'Martin':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'32.2597','lon':'-94.5692'},
'Belews':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.2824','lon':'-80.0592'},
'Jeffrey':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'39.2857','lon':'-96.1166'},
'Gaston':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.2430','lon':'-86.4595'}}

for name in locations:
    res = req.post('https://prism.oregonstate.edu/explorer/dataexplorer/rpc.php',data=locations[name])

    y = np.array(res.json()['result']['data']['tmean'])
    x = np.arange(1920,2020)

    mp.title(name)
    mp.plot(x, y)
    m, b = np.polyfit(x, y, 1)
    print("%-20s %+.4f" % (name, m))
    mp.plot(x, m*x+b)

    mp.tight_layout()
    mp.savefig(name+'.png')
    mp.cla()

Snow in the Era of Global Warming

Is anyone curious to know what the global snowfall trend was in this era of “extreme” global warming?

I was. Luckily NASA covertly provides us with all the necessary data to figure this out.

March 2021

I downloaded all available monthly images from 1980 to 2020 (inclusive), such as the one shown above, then I converted the pixel colors back to data using the provided scale.

The error margin is small and time-persistent, and so this is a clever way to extract a rich dataset that I haven’t been able to find anywhere else.
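The inversion idea as a hypothetical Python sketch (the actual pipeline below does this with netpbm and a generated sed script; the 0 to 7 value range matches its scale() function):

# Invert an image's own colorbar back into physical values.
from PIL import Image  # assumption: Pillow is available

def build_lut(scale_colors, vmin=0.0, vmax=7.0):
    # scale_colors: RGB tuples sampled in order along the provided scale bar
    n = len(scale_colors)
    return {rgb: vmin + (vmax - vmin) * i / (n - 1) for i, rgb in enumerate(scale_colors)}

def image_to_values(path, lut):
    img = Image.open(path).convert("RGB")
    w, h = img.size
    # colors not on the scale (e.g. black = no data) map to None and get skipped later
    return [[lut.get(img.getpixel((x, y))) for x in range(w)] for y in range(h)]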

As far as I know, you will not see this anywhere else. All other snowfall or snow-cover datasets are limited by region or date and so researchers reach the wrong conclusion!

Here is the result of my quest:

Global Snowfall
2.773 -> 2.854 is +2.90%

Snowfall has increased by nearly 3 percent over the last four decades!

Units are milligrams per square meter per second.

Let’s also see how this breaks down by North and South hemisphere:

North Hemisphere Snowfall
2.722 -> 2.468 is -9.35%
South Hemisphere Snowfall
2.824 -> 3.239 is +14.71%

The SH increase in snow more than compensates for the NH decrease. This led to an overall increase in snow during our great era of global warming!

Chart data is archived @ https://pastebin.com/raw/XpsVwdjj

That’s it. Enjoy 🙂 -Zoe

Code:

# Zoe Phin, v2 - 2021/05/07
# File: snow.sh
# Run: source snow.sh; require; download; index; plots
# Output: snow.png

require() { sudo apt-get install -y gmt gnuplot netpbm; }

download() {
    for y in {1980..2020}; do
        for m in {01..12}; do
            d="$y-$m-01"
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MERRA2_Snowfall_Monthly&TIME=$d'"
        done
    done > sets.sh
    bash sets.sh
}

scale() {
    for m in {01..12}; do
        pngtopnm 2020-$m-01.png | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
            printf "%03d %03d %03d\n", $1, $3, $2}' 
    done | sort -r | uniq | awk '{
        printf "s/%s %s %s/%0.2f/\n", $1, $3, $2, (NR-1)/140*7 }' > replace.sed
}

all() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)-89.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

nhs() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 1,32400p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)-89.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

shs() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 32401,64800p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)+0.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

index() {
    scale
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; all $f; done | tee all.csv
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; nhs $f; done | tee nhs.csv
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; shs $f; done | tee shs.csv
}

linear() {
    cat $1.csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f -> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    echo -n "$1: "
    linear $1; paste -d ' ' $1.csv .lin > plot.csv
    echo "set term png size 740,470
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%.1f'
        set xtics 157788864
        set ytics 0.2; set mxtics 5; set mytics 2
        set xrange ['1979-11-01':'2021-03-01']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:(10*\$2) t 'Snowfall (mg/m²/s)' w lines lw 2 lc rgb '#0000CC',\
                     '' u 1:(10*\$3) t 'Linear Regression'  w lines lw 3 lc rgb '#000055'		
    " | gnuplot > snow-$1.png 
}

plots() { plot all; plot nhs; plot shs; }

archive() {
    ( echo "Date,Global,NH,SH"; 
    paste all.csv nhs.csv shs.csv | awk '{print $1","$2","$4","$6}' ) > data.csv
}

Tiniest Useful Executable

A long, long time ago, when I got my first job on Wall Street, I had the opportunity to partake in writing some of the fastest high-frequency financial execution engines possible. Although I barely remember the x86 assembly language, I still haven’t completely forgotten it either.

I wanted to use some of these forgotten skills to write some fast climate data gathering programs, because there’s too much bloat in existing software. But honestly, it’s just too tough and I don’t have the time. In trying, though, I wrote the tiniest useful program possible on the x86 platform.

While some girls like to solve standard puzzles, I’ve always enjoyed putting my autistic skills to finding new exotic puzzles to solve.

So what puzzle am I trying to solve?

A: Creating the smallest program on Linux that actually does something useful. This program must be its own executable and not just text interpreted by another program.

What can be more useful than being able to do anything a computer can do?

How about a compiler? …. What kind of a compiler?

How about compiling 1’s and 0’s into runnable code?

When was the last time anyone actually programmed by typing only 1’s and 0’s on the keyboard?

Well, I’m bringing it back!

First I needed to find out how to make the smallest runnable program on Linux, and its filesize. Luckily someone figured this out in a very clever way. Thank you, Brian Raiter.

Here, at last, we have honestly gone as far as we can go. There is no getting around the fact that the 45th byte in the file, which specifies the number of entries in the program header table, needs to be non-zero, needs to be present, and needs to be in the 45th position from the start of the ELF header. We are forced to conclude that there is nothing more that can be done.

Brian Raiter

Here is Brian’s solution:

; tiny.asm - Brian Raiter
BITS 32
        org     0x00010000

        db      0x7F, "ELF"  
        dd      1           
        dd      0           
        dd      $$          
        dw      2           
        dw      3           
        dd      _start      
        dd      _start      
        dd      4          
_start:
        mov     bl, 42    ; M
        xor     eax, eax  ; E
        inc     eax       ; A   
        int     0x80      ; T
        db      0
        dw      0x34          
        dw      0x20  
        db      1        

; -----------------------     
; $ nasm -f bin -o a.out tiny.asm
; $ chmod +x a.out
; $ ./a.out ; echo $?
; 42
; $ wc -c a.out
;      45 a.out

This 45-byte program (the tiniest possible) simply sets the exit code to 42, and then exits. The real meat of his program is just 7 bytes (B3 2A 31 C0 40 CD 80), and can actually be 5 (no xor eax,eax) with the same result.

While Brian used NASM, I preferred to use FASM, since I’m more familiar with it.

Now here’s my tiny masterpiece:

;   Zoe Phin, 2021/04/24
;   bex.asm
    use32
    org         0x505A4000
      
    db          0x7F, "ELF"     
    dd          1                               
    dd          0                               
    dd          $$                              
    dw          2               
    dw          3               
    dd          start           
    dd          start           
    dd          4               

    start:      pop     ecx     
                pop     edi
                jmp     .end    

    .arg:       pop     esi     

        .char:  lodsb
                cmp     al, 0
                jmp     .cont   

    dw          32              
    dd          1               

        .cont:  je      .done
                rcr     al, 1
                adc     ah, ah
                jmp     .char

        .done:  mov     [edi+edx], ah
                inc     edx
                xor     eax,eax

    .end:       loop    .arg
                jmp     edi

I call it bex, short for Binary EXecution. What does it do?

It converts every argument consisting of ASCII 1’s and 0’s on the command line into its binary representation, and then jumps execution to the first argument.
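In higher-level terms, the conversion rule is roughly this (a Python sketch of the behavior, not the assembly itself):

# Each argument becomes one byte, built from the lowest ASCII bit of each
# character; this mirrors what the rcr al,1 / adc ah,ah pair above does.
def bex_bytes(args):
    out = bytearray()
    for arg in args:
        byte = 0
        for ch in arg:
            byte = ((byte << 1) | (ord(ch) & 1)) & 0xFF
        out.append(byte)
    return bytes(out)

print(bex_bytes(["01000000", "11001101", "10000000"]).hex())  # 40cd80 = inc eax; int 0x80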

So, for example, the shortest program would be just to exit cleanly without a segfault:

$ ./bex 01000000 11001101 10000000

The ASCII binary represents opcodes for

inc eax    ; 01000000
int 80h    ; 11001101 10000000

Brian’s program would be:

mov bl, 42 ; 10110011 00101010
inc eax    ; 01000000
int 80h    ; 11001101 10000000

Run it:

$ ./bex 10110011 00101010 01000000 11001101 10000000
$ echo $?
42
$

And now I test a semi-Quine:

$ ./bex 10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000 | xxd -b -c10 | cut -c11-99

10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000

It’s a semi-Quine because the output is a binary representation of the ASCII input, but not the ASCII input itself.

The assembly code equivalent is:

mov     al, 4    ; 10110000 00000100
inc     ebx      ; 01000011
mov     ecx,edi  ; 10001001 11111001
int     80h      ; 11001101 10000000
xchg    eax,ebx  ; 10010011
int     80h      ; 11001101 10000000

A few notes:

  • Leading 0’s are not needed in an argument ( ex: 101 = 00000101 )
  • 1’s and 0’s are not specifically needed: any character with an odd ASCII value counts as a 1, and any even value as a 0
  • Any non-binary data can also be placed directly inline

Example of notes:

$ ./bex 7.77.... 188 1....11 10110010 ++.+ 30003003 1111---1 121221 11414441 11661121 9TTTTTTT 1RR1PP11 11351171 9:::.::: "Hello World!"

Hello World!

Assembly equivalent:

mov     al, 4    ; 10110000 00000100 ; 7.77.... 188
inc     ebx      ; 01000011          ; 1....11
mov     dl, 13   ; 10110010 00001101 ; 10110010 ++.+
mov     ecx,esi  ; 10001001 11110001 ; 30003003 1111---1
sub     ecx,edx  ; 00101001 11010001 ; 121221 11414441 
int     80h      ; 11001101 10000000 ; 11661121 9TTTTTTT
xchg    eax,ebx  ; 10010011          ; 1RR1PP11
int     80h      ; 11001101 10000000 ; 11351171 9:::.:::

The code basically tells Linux to write the 13 bytes at the end of the argument area (“Hello World!” is 12 characters plus a terminating byte).

You can obtain bex by writing the binary directly and making it executable, like so:

$ echo -en $(printf "\\\x%s" 7F 45 4C 46 01 00 00 00 00 00 00 00 00 40 5A 50 02 00 03 00 20 40 5A 50 20 40 5A 50 04 00 00 00 59 5F EB 1A 5E AC 3C 00 EB 06 20 00 01 00 00 00 74 06 D0 D8 10 E4 EB ED 88 24 17 42 31 C0 E2 E4 FF E7) > bex; chmod +x bex

Or you can execute a script to download FASM and compile and test from source code:

$ bash <<< $(wget -qO- https://pastebin.com/raw/Tvmg0nr6 | tr -d '\r')

Successful result:

flat assembler  version 1.73.27  (16384 kilobytes memory)
2 passes, 66 bytes.
Input:  10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000
Output: 10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000
Match!

The tiniest useful executable is only 66 bytes!

Mission accomplished. Puzzle solved.

You can do pretty much anything in it! Its inconvenience is a feature 😉

A few notes on the run environment:

  • edx is set to number of arguments
  • edi points to first argument
  • esi points to NULL located after arguments, just before ENVironment variables.

Yes, this is obviously a 32-bit program, but it runs perfectly fine on x86-64 based Linux.

Enjoy 🙂 -Zoe

P.S.: As a bonus, bex includes my initials directly in the binary:

00000000: 7F 45 4C 46 01 00  .ELF..
00000006: 00 00 00 00 00 00  ......
0000000c: 00 40 5A 50 02 00  .@ZP..
00000012: 03 00 20 40 5A 50  .. @ZP
00000018: 20 40 5A 50 04 00   @ZP..
0000001e: 00 00 59 5F EB 1A  ..Y_..
00000024: 5E AC 3C 00 EB 06  ^.<...
0000002a: 20 00 01 00 00 00   .....
00000030: 74 06 D0 D8 10 E4  t.....
00000036: EB ED 88 24 17 42  ...$.B
0000003c: 31 C0 E2 E4 FF E7  1.....

4 Decade Global Snowfall Trend

NOTE: THIS POST AND CODE IS DEPRECATED. PLEASE GO HERE FOR UPDATED VERSION.

Is anyone curious to know what the global snowfall trend was in this era of “extreme global warming”?

I was. Luckily NASA covertly provides us with all the necessary data to figure this out.

January 2021

I downloaded all available monthly images from 1980 to 2020 (inclusive), such as the one shown above, then I converted the pixel colors back to data using the provided scale.

The error margin is small and time-persistent and so this is a clever way to extract a rich dataset which I haven’t been able to find anywhere else.

As far as I know, you will not see this anywhere else. All other snowfall or snow-cover datasets are limited by region or date and so researchers reach the wrong conclusion!

Here is the result of my quest:

Global Snowfall
0.2773 -> 0.2854 is +2.90%

Snowfall has increased by nearly 3 percent over the last four decades.

Let’s also see how this breaks down by North and South hemisphere:

North Hemisphere Snowfall
0.2722 -> 0.2468 is -9.35%
South Hemisphere Snowfall
0.6257 -> 0.7057 is +12.77%

That’s it. Enjoy 🙂 -Zoe

NOTE: THIS POST AND CODE IS DEPRECATED. PLEASE GO HERE FOR UPDATED VERSION.

Correction:

The units are in kilograms divided by 100000 (better known as centigrams), not kilograms. I forgot about the original scale and mislabeled the charts. I won’t fix the chart. The purpose was to find the trend.

Update:

You can generate your own charts using data archived here.

Code:

# Zoe Phin, 2021/05/04
# File: snow.sh
# Run: source snow.sh; require; download; index; plot
# Output: snow.png

require() { sudo apt-get install -y netpbm gmt; }

sets() {
    for y in {1980..2020}; do
        for m in {01..12}; do
            d="$y-$m-01"
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MERRA2_Snowfall_Monthly&TIME=$d'"
        done
    done > sets.sh
}

download() { 
    sets; bash sets.sh; 
    find . -name \*png -type f -size -10k -exec rm {} \;
}

scale() {
    for m in {01..12}; do
        pngtopnm 2020-$m-01.png | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
            printf "%03d %03d %03d\n", $1, $3, $2}' 
    done | sort -r | uniq | awk '{
        printf "s/%s %s %s/%0.2f/\n", $1, $3, $2, (NR-1)/140*7 }' > replace.sed
    echo "s/000 000 000/999/" >> replace.sed
}

onefile() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 1,64800p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | sed 's/... ... .../999/' | awk '$1!=999{
        l=sprintf("%d",(NR-1)/360)-89.5
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA+=A; S+=$1*A
    } END {
        printf "%.6f\n", S/SA
    }'
}

index() {
    scale
    for f in $(ls -1 [12]*.png); do
        echo -n "${f/.png/} "
        onefile $f
    done | tee .csv
}

linear() {
    cat .csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f -> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "set term png size 740,560
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%.2f'
        set xtics 157788864
        set ytics 0.02; set mxtics 5; set mytics 2
        set xrange ['1979-11-01':'2021-03-01']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Snowfall (kg/m²/s)' w lines lw 2 lc rgb '#0000CC',\
                     '' u 1:3 t 'Linear Regression'  w lines lw 3 lc rgb '#000055'		
    " | gnuplot > snow.png 
}

Trend in Gross Primary Production

I downloaded all 8-day increment Gross Primary Production data from NASA from 2001 to 2020 (inclusive) to find the 20-year trend.

April 23, 2021

The Result:

+9.17% Rise in Gross Primary Production
0.0225 --> 0.0246, is +9.17%

This is all good news. Enjoy 🙂 -Zoe

The Code:

# Zoe Phin, 2021/05/03
# File: prod.sh
# Run: source prod.sh; require; download; index; plot
# Output: trend info & prod.png

require() { sudo apt-get install -y netpbm gmt; }

sets() {
    for y in {2001..2020}; do
        let n=0
        for w in {0..45}; do
            d=$(date -d "$y-01-01 +$n days" +%Y-%m-%d)
            let n+=8
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MODIS_Terra_L4_Gross_Primary_Productivity_8Day&TIME=$d'"
        done
    done > sets.sh
}

download() { 
    sets; bash sets.sh; 
    find . -name \*png -type f -size -10k -exec rm {} \;
}

scale() {
    awk 'BEGIN{
        for (i=100;i<=255;i++) printf "%03d 000 000\n", i
        for (i=000;i<=255;i++) printf "255 %03d 000\n", i
        for (i=255;i>=000;i--) printf "%03d 255 000\n", i
        for (i=000;i<=255;i++) printf "000 255 %03d\n", i
        for (i=255;i>=000;i--) printf "000 %03d 255\n", i
    }' | awk '{printf "s/%s/%0.6f/\n", $0, (NR-1)/1180*0.095 }' > replace.sed
    echo "s/000 000 000/999/" >> replace.sed
}

onefile() {
    scale; pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | sed 's/... ... .../999/' | awk '$1!=999{
        l=sprintf("%d",(NR-1)/360)-89.5
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA+=A; S+=$1*A
    } END {
        printf "%.6f\n", S/SA
    }'
}

index() {
    for f in $(ls -1 2*.png); do
        echo -n "${f/.png/} "
        onefile $f
    done | tee .csv
    cp .csv bak.csv
}

linear() {
    cat .csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f --> %.4f, is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal reverse
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01; set mxtics 2; set mytics 5
        set xrange ['2000-12-01':'2020-02-01']
        set grid xtics mxtics ytics mytics
        plot 'plot.csv' u 1:2 t 'Gross Primary Production (kg C/m²)' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#005500'		
    " | gnuplot > prod.png 
}

Global Water Dynamics

I think this dataset is very interesting. We can monitor global water level changes from 1999 to 2020 layered on top of Google Maps here. I naturally chose the Maldives as the starting point, to see if the tiny island nation is “doomed” as climate alarmists like to claim. Of course it isn’t.

Male, Maldives

Seldom any water level gains, and plenty of land reclamation. You are free to browse to other nearby islands and see that there is indeed very little encroachment of water.

The story is a little different for Bangladesh:

South Bangladesh

There is some water gain in certain areas, but there is also loss in other areas. There is little evidence of encroachment from the ocean. Just river dynamics.

Florida:

South Florida

I don’t see much coastal water encroachment.

I’d love to run some analysis on this data, but the amount of it is simply overwhelming. Safe to say that since these researchers didn’t quantify coastal water gain/loss for us, it most likely doesn’t favor the alarmist position.

Anyways, Enjoy 🙂 -Zoe

Sorry I’ve been very busy lately.

Small Mystery Explained

The official energy budget contains a small value that some may not understand.

Source

What does that “0.6 net absorbed” value mean?

The answer is that most energy budgets mix normal (“all”) sky and clear sky conditions.

To show this, I will use NOAA’s 20th Century Reanalysis Version 3 data. The code is short:

# Zoe Phin, 2021/03/27
# File: point6.sh
# Run: source point6.sh; require; download; parse; point6

require() { sudo apt-get install -y nco; }

download() {
    wget -c ftp://ftp2.psl.noaa.gov/Datasets/20thC_ReanV3/Monthlies/sfcFlxSI-MO/ulwrf.sfc.mon.mean.nc
    wget -c ftp://ftp2.psl.noaa.gov/Datasets/20thC_ReanV3/Monthlies/clrSkyFlxSI-MO/csulf.sfc.mon.mean.nc
}

avg() { # Area-weighted global mean: 1-degree WGS84 cells over Earth's surface area (510.044e6 km2)
    awk -F '[= ]' '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; l=$4-0.5
        S+=$8*(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2/510.044e6
    } END { printf "%8.4f\n", S }'
}

parse() {
    rm -f .sfclw .clslw
    cmd="ncks --trd -HC -d lat,1,179 -d time," 
    for t in {0..2159}; do
        echo $t
        $cmd$t,$t -v ulwrf ulwrf.sfc.mon.mean.nc | avg >> .sfclw
        $cmd$t,$t -v csulf csulf.sfc.mon.mean.nc | avg >> .clslw
    done
}

point6() {
    paste .sfclw .clslw | awk '{printf "%4d %s\n", 1836+(NR-1)/12, $1-$2}' | awk '
    { A[$1]+=$2/12 } END { for (y in A) printf "%4d %8.4f\n", y, A[y] }' | column -c 64
}

The result for years 1836 to 2015 is:

1836 0.693 1881 0.683 1926 0.677 1971 0.675
1837 0.694 1882 0.678 1927 0.674 1972 0.665
1838 0.692 1883 0.681 1928 0.676 1973 0.670
1839 0.694 1884 0.676 1929 0.674 1974 0.669
1840 0.696 1885 0.680 1930 0.673 1975 0.675
1841 0.694 1886 0.684 1931 0.680 1976 0.668
1842 0.694 1887 0.683 1932 0.671 1977 0.671
1843 0.692 1888 0.681 1933 0.669 1978 0.675
1844 0.693 1889 0.686 1934 0.668 1979 0.665
1845 0.691 1890 0.688 1935 0.671 1980 0.667
1846 0.690 1891 0.685 1936 0.673 1981 0.667
1847 0.692 1892 0.690 1937 0.662 1982 0.670
1848 0.694 1893 0.685 1938 0.670 1983 0.667
1849 0.692 1894 0.683 1939 0.666 1984 0.665
1850 0.689 1895 0.679 1940 0.670 1985 0.669
1851 0.688 1896 0.681 1941 0.673 1986 0.669
1852 0.689 1897 0.683 1942 0.677 1987 0.661
1853 0.684 1898 0.683 1943 0.675 1988 0.665
1854 0.685 1899 0.681 1944 0.676 1989 0.666
1855 0.682 1900 0.685 1945 0.670 1990 0.667
1856 0.687 1901 0.682 1946 0.677 1991 0.668
1857 0.688 1902 0.684 1947 0.683 1992 0.676
1858 0.688 1903 0.695 1948 0.678 1993 0.672
1859 0.686 1904 0.686 1949 0.673 1994 0.669
1860 0.691 1905 0.687 1950 0.668 1995 0.668
1861 0.693 1906 0.685 1951 0.672 1996 0.668
1862 0.686 1907 0.686 1952 0.668 1997 0.660
1863 0.692 1908 0.690 1953 0.672 1998 0.669
1864 0.694 1909 0.691 1954 0.684 1999 0.665
1865 0.688 1910 0.690 1955 0.683 2000 0.661
1866 0.695 1911 0.692 1956 0.675 2001 0.661
1867 0.689 1912 0.688 1957 0.674 2002 0.663
1868 0.688 1913 0.680 1958 0.675 2003 0.663
1869 0.693 1914 0.679 1959 0.676 2004 0.667
1870 0.691 1915 0.680 1960 0.674 2005 0.662
1871 0.687 1916 0.690 1961 0.673 2006 0.663
1872 0.690 1917 0.681 1962 0.673 2007 0.669
1873 0.686 1918 0.681 1963 0.670 2008 0.662
1874 0.691 1919 0.685 1964 0.679 2009 0.659
1875 0.681 1920 0.685 1965 0.674 2010 0.669
1876 0.687 1921 0.678 1966 0.676 2011 0.661
1877 0.682 1922 0.682 1967 0.676 2012 0.655
1878 0.684 1923 0.675 1968 0.673 2013 0.658
1879 0.682 1924 0.683 1969 0.670 2014 0.656
1880 0.679 1925 0.672 1970 0.673 2015 0.656

Ah, but that’s not 0.6 you say! Well, it’s not always 0.6. This energy budget has it at 0.71:

Imbalance ranges from 0.61 to 0.81

And this classic 1997 budget has it at 0.9:

The “imbalance” is just the difference between normal (“all”) sky and clear sky: in other words, the “greenhouse effect” of clouds. And what a small and pathetic “greenhouse effect” it is.

In my opinion, it’s not a greenhouse effect at all, but the extra amount of energy it takes to have clouds in the first place. I feel the standard interpretation confuses cause and effect. Maybe I’m wrong. If so, someone please explain how it doesn’t take energy to create clouds.

That’s all for now. Enjoy 🙂 -Zoe

Trend of Chlorophyll in Water

NASA has a data product that tracks the amount of chlorophyll in water across the globe.

I downloaded all available 2003-2020 (inclusive) monthly data in 1440 by 720 pixel format to see how chlorophyll in water changes over time.

This task is actually not easy because there’s a lot of missing data (black pixels). I decided to use only non-missing pixels that are persistent across all 216 months. There are 27998 of them. That’s 2.7% of the globe.
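A minimal numpy sketch of that persistence filter (hypothetical; the actual pipeline below does this with paste and grep in convert(), where 99999.0 marks missing data):

# Keep only pixels that are valid in every one of the 216 monthly grids.
import glob
import numpy as np

files = sorted(glob.glob("2*.csv"))                    # the downloaded monthly grids
grids = [np.loadtxt(f, delimiter=",") for f in files]  # each 720 rows x 1440 cols
persistent = np.ones_like(grids[0], dtype=bool)
for g in grids:
    persistent &= (g != 99999.0)                       # intersect validity masks
print(persistent.sum())                                # the post finds 27998 cells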

This is a map of these persistent locations:

Omnipresent Chlorophyll Measures

Here’s the trend for these locations:

Change (2003,2020): 2.942 %
Linear Regression Change (2003-2020): 2.736 %

That’s a nice positive change.

I then thought about it some more, and decided to do it on a monthly basis. Here’s a map of all persistent January pixels for all 18 years:

And here is All July:

Here is the linear regression trend for every month:

01 -0.046 %
02 +4.834 %
03 +8.112 %
04 +1.008 %
05 -7.842 %
06 -1.898 %
07 +1.130 %
08 +5.267 %
09 +6.818 %
10 +1.999 %
11 +5.372 %
12 +3.786 %

Avg: +2.378 %

Enjoy 🙂 -Zoe

Update

Some more data:

Note the rise from 1997 to 1998. That was a very warm season. Look at the rise! How can anyone be against global warming, except the cult of death?

Code

# Zoe Phin, 2021/03/24
# File: chloro.sh
# Run: source chloro.sh; require; download; convert; chloro; cmap; plot; change; annual; months

require() { sudo apt-get install -y gmt gnuplot; }

download() {
    rm -f *.csv
    for y in {2003..2020}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MY1DMM_CHLORA&year=$y"
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
    awk '{print "wget -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=1440&height=720\""}' sets.csv > sets.sh
    bash sets.sh
}

convert() {
    for f in $(ls -1 2*.csv); do
        cat $f | tr ',' '\n' > ${f/.csv/.dat}
    done
    ls -1 *.dat | xargs paste -d ' ' | nl > .db
    grep -v '99999.0' .db > chloro.db
    rm -f *.dat .db
}

chloro() {
    awk 'BEGIN {
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
    } { 
        l = sprintf("%d", ($1-1)/1440)/4-89.875
        A = (a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA += A
        for (i=2; i<=NF; i++) 
            S[i] += $i*A
    } END {
        for (m in S)
            printf("%.5f\n", S[m]/SA)
    }' chloro.db | awk '{
        printf "%.2f %.5f\n", 2003+(NR-1)/12+1/24, $1
    }' > chloro.csv
}

cmap() {
    awk '{
        lat = sprintf("%d", ($1-1)/1440)/4-89.875
        lon = ($1-1)%1440*360/1440-180
        printf "%.3f %.2f\n", lat, lon
    }' chloro.db > cmap.csv
    echo "set term png size 560,360
        set margin 4,2,2,1
        set nokey 
        set yrange [-90:90]
        set xrange [-180:180]
        set ytics 30; set xtics 30
        set grid ytics xtics
        plot 'cmap.csv' u 2:(-\$1) t 'Chlorophyll' pt 5 ps 0.1 lc rgb '#00CC00'
    "| gnuplot > cmap.png 
}

plot() { 
    echo -n "Linear Regression Slope: "
    cat chloro.csv | gmt gmtregress -Fp -o5 | awk '{printf "%.5f /year\n", $1}'
    cat chloro.csv | gmt gmtregress | sed 1d | awk '{printf "%.2f %.5f %.5f\n", $1, $2, $3}' > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal
        set mxtics 2; set mytics 2
        set format y '%.2f'
        set xrange [2003:2021]
        set grid xtics mxtics ytics mytics
        plot 'plot.csv' u 1:2 t 'Chlorophyll (mg/m³)' w lines lw 2 lc rgb '#00CC66',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#006666'		
    "| gnuplot > chloro.png 
}

change() {
    echo -n "Change (2003,2020): "
    awk '{Y[substr($1,1,4)]+=$2/12} END { printf "%.3f %\n", (Y[2020]/Y[2003]-1)*100 }' plot.csv
    echo -n "Linear Regression Change (2003-2020): "
    awk '{Y[substr($1,1,4)]+=$3/12} END { printf "%.3f %\n", (Y[2020]/Y[2003]-1)*100 }' plot.csv
}

annual() {
    awk '{Y[substr($1,1,4)]+=$2/12} END { for (y in Y) printf "%s %.5f\n", y, Y[y]}' chloro.csv
}

months() {
    for m in {01..12}; do
        echo -n "$m "
        for f in $(ls -1 2*-$m-*.csv); do
            cat $f | tr ',' '\n' > ${f/.csv/.dat}
        done
        ls -1 2*-$m-*.dat | xargs paste -d ' ' | nl > .db
        grep -v '99999.0' .db | tee chloro.db | awk 'BEGIN {
            a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        } { 
            l = sprintf("%d", ($1-1)/1440)/4-89.875
            A = (a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
            SA += A
            for (i=2; i<=NF; i++) 
                S[i] += $i*A
        } END {
            for (m in S)
                printf("%.7f\n", S[m]/SA)
        }' | awk '{
        printf "%.2f %.5f\n", 2003+(NR-1), $1
        }' | gmt gmtregress | sed 1d | awk '{Y[substr($1,1,4)]+=$3/12} END { printf "%+.3f %\n", (Y[2020]/Y[2003]-1)*100 }'
    done | tee .seas
    awk '{S+=$2} END { printf "\nAvg: %+.3f %\n", S/NR }' .seas
}

Coastal Sealevel Rise

Climate alarmists are worried that the sea level is rising too fast and flooding is coming soon. You can find many data images like this on the net:

3.2 mm/year. The problem is that this is for all ocean water. If flooding is a concern, then shouldn’t we ask what is happening at the coasts? Is it different from the oceans as a whole?

I decided to find out.

I downloaded over a gigabyte of 720×361 gridded data covering 1950 to 2009. I only examine those grid cells that are adjacent to land (2808 out of 259920).
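Here is roughly what “adjacent to land” means, as a hypothetical numpy sketch (the actual awk in onetime() below scans the cell stream and keeps values at missing/non-missing transitions):

# Select ocean cells with a land neighbor along the row. np.roll wraps
# around in longitude, which is what you want on a global grid.
import numpy as np

def coastal_mask(ssha):   # ssha: one 2D time slice with NaN over land
    ocean = ~np.isnan(ssha)
    land = ~ocean
    return ocean & (np.roll(land, 1, axis=1) | np.roll(land, -1, axis=1))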

Here is my result:

1950  -4.387969	1970  -1.113571	1990  +0.478104
1951  -3.858797	1971  -0.531201	1991  +0.651717
1952  -4.040961	1972  -0.770646	1992  +0.665983
1953  -3.568315	1973  -1.020810	1993  +0.086299
1954  -3.699824	1974  -0.375272	1994  +1.093564
1955  -2.807692	1975  -0.504674	1995  +1.871986
1956  -3.675497	1976  -1.817893	1996  +3.126923
1957  -3.445263	1977  -1.405565	1997  +2.290404
1958  -3.788105	1978  -0.750346	1998  +4.180212
1959  -3.993297	1979  -1.387182	1999  +5.105531
1960  -2.135054	1980  -1.673765	2000  +4.515499
1961  -2.499847	1981  -0.377484	2001  +4.702255
1962  -2.632606	1982  -2.025174	2002  +3.391415
1963  -2.978503	1983  +0.697424	2003  +4.399230
1964  -3.627167	1984  +0.632456	2004  +4.762698
1965  -2.821440	1985  -0.085001	2005  +4.951383
1966  -2.954607	1986  -0.053231	2006  +5.608991
1967  -2.215466	1987  -0.204903	2007  +5.249474
1968  -2.939226	1988  +0.930574	2008  +7.056215
1969  -1.268895	1989  +0.929616	2009  +6.901877

Trend (mm/year): 1.68746

1.69 mm/year. As you can see, the coastal trend is about half the total ocean trend. Funny how greenhouse gases do that 😉

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/03/17
# File: coastal.sh
# Run: . coastal.sh ; require; download; alltime; analysis

require() { sudo apt-get install -y nco gmt; }

download() {
    wget -cO 1950s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19500103_19591227_v1.nc.gz
    wget -cO 1960s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19600103_19691227_v1.nc.gz
    wget -cO 1970s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19700103_19791227_v1.nc.gz
    wget -cO 1980s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19800103_19891227_v1.nc.gz
    wget -cO 1990s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19900103_19991227_v1.nc.gz
    wget -cO 2000s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_20000103_20090627_v1.nc.gz
}

onetime() { # Coast-adjacent mean sea level anomaly for decade file $1, time slice $2
    ncks --trd -HC -d time,$2,$2 -v ssha ${1}s.nc | sed \$d | awk -F '[= ]' '
    function s(a) { return sin(atan2(0,-1)*a/180) }
    function area(lat) { if (lat<0) lat*=-1; return (s(lat+0.25)-s(lat-0.25))/720 }
    NR==1 { T=$2 }
    $4>=-57 && $4<=57 {	
        # keep only cells at a missing/non-missing boundary ("_" = land)
        if (NR != 1) {
            if ($8 == "_" && D != "_") { A=area(L);  S+=D*A;  N+=A }
            if ($8 != "_" && D == "_") { A=area($4); S+=$8*A; N+=A }
        }
        L=$4; D=$8
    } END {
        T=sprintf("%d",T)
        "date +%Y-%m-%d -d \"1900-01-01 +"T" days\"" | getline t
        printf "%s %+010.6f\n", t, S/N/10
    }'
}

alltime() { (
    for d in {1950..1990..10}; do
        for t in {000..519}; do
            onetime $d $t
        done
    done
    for t in {000..493}; do
        onetime 2000 $t
    done ) | tee sealevel.csv
}

analysis() {
    awk '{ Y[substr($1,1,4)]+=$2; N[substr($1,1,4)]+=1 
    } END { for (i in Y) printf "%.0f %+10.6f\n", i, Y[i]/N[i]
    }' sealevel.csv | tee .result | gmt gmtregress -Fp -o5 > .trend

    cat .result | column -c 60
    echo -n "Trend (mm/year): "
    cat .trend | awk '{print $1*10}'
}

81 Million Ballots

Supposedly 81 million actual people, not just ballots, voted for Joe Biden. There are ample reasons to suspect that foul play bumped this number by several million. You can find the evidence if you’re looking for it with an open mind. Even if that’s not the case (doubtful), actual enthusiasm for this highly popular president is, shall we say, laughable. One way to track this is the like-to-dislike ratio on White House YouTube videos. The actual like-to-dislike ratio, not the manipulated one.

As you know, I analyzed this a short while ago in my article White House Youtube Dislike Manipulation. I am happy it inspired someone to keep a persistent watch on the most popular administration ever:

81m.org

Here we can see that Biden’s recent SOTU address was disliked by 93.31% of online voters.

The media might have you believe these dislikes are bots and thus Google is justified in removing them. Odd. So only dislikers would bother to use “bots”? That’s how engaged they are? Likers are not engaged? Current White House fans have no money to buy bots, or ask Google for a like boost?

No, I don’t think so. That excuse won’t fly. I think these are engaged people voting legitimately, something the establishment couldn’t care less about.

Biden’s address was stupid and creepy, and he deserved all legitimate dislikes for it, including mine.

But that’s just my opinion. This post is not about my political opinions. I just wanted to thank 81m.org for keeping track of this situation and acknowledging their inspiration.

Give them a visit.

Best regards,

-Zoe

Accurate Global Greening

In a previous post, Fortunate Global Greening, I used low-resolution NASA data to determine changes to the Vegetation Index. I did this because I didn’t want to spend 5 hours downloading 23 gigabytes of data for the highest resolution. I didn’t think this would matter much, but unfortunately for me, it does. Here’s the new analysis:

0.3746 --> 0.3937 is +5.08%

I made two other changes. I now average over land only, rather than the whole globe; that’s why these numbers are much larger than previously. I also use a linear regression, so I don’t lose a whole year to a moving average.

The actual global greening this century, using highest resolution data, is a little over 5%, not nearly 10% as I previously found.
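As a sanity check, the headline figure can be re-derived from the two regression endpoints above. (A sketch: the rounded endpoints shown give roughly +5.10%; the linear() function below uses the unrounded values, hence +5.08%.)

awk 'BEGIN { a=0.3746; b=0.3937; printf "%+.2f%%\n", (b/a-1)*100 }'   # prints +5.10%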

Semper veritas. -Zoe

Code

# Zoe Phin, 2021/03/07
# File: veg.sh  (Version 2.0)
# Run: source veg.sh; sets; download; index; plot

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_NDVI_16&year=$y" 
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv -c \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=3600&height=1800\""}' sets.csv > sets.sh
    bash sets.sh
    rm -f 201[789]-12-31.csv
}

index() {
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        awk -F, '{
            a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
            l=NR*180/1800-90.05
            A=(0.1*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
            for (i=1;i<=NF;i++) { 
                if ($i==99999) continue
                SA+=A; S+=$i*A
            }
        } END {
            printf "%.6f\n", S/SA
        }' $f
    done > .csv
}

linear() {
    cat .csv | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 } ' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f --> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "
        set term png size 740,620
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01
        set mxtics 2
        set mytics 5
        set xrange ['2000-01-01':'2021-02-28']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Vegetation Index ' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#005500'		
    " | gnuplot > veg.png 
}

Land Change in Australia

This is for my Aussie fans. I show how Australia’s landscape changed from 2001 to 2019 using best available satellite data.

Parts of Indonesia and Papua New Guinea that should appear on the map are changed to water.

Changes:

Type                                |   2001  |   2019  |  Change |  % Chg
------------------------------------+---------+---------+---------+---------
Evergreen Needleleaf Forest         |  0.4615 |  0.4885 | +0.0270 |   +5.85%
Evergreen Broadleaf Forest          |  2.4580 |  2.5659 | +0.1079 |   +4.39%
Deciduous Broadleaf Forest          |  0.0094 |  0.0167 | +0.0073 |  +77.66%
Mixed Forests                       |  0.0570 |  0.0737 | +0.0167 |  +29.30%
Closed Shrubland                    |  3.8315 |  4.0744 | +0.2429 |   +6.34%
Open Shrublands                     | 54.0555 | 54.7268 | +0.6713 |   +1.24%
Woody Savannas                      |  1.6631 |  1.9115 | +0.2484 |  +14.94%
Savannas                            |  6.4679 |  6.6289 | +0.1610 |   +2.49%
Grasslands                          | 24.9655 | 23.3947 | -1.5708 |   -6.29%
Permanent Wetlands                  |  0.2007 |  0.2198 | +0.0191 |   +9.52%
Croplands                           |  3.4671 |  3.5146 | +0.0475 |   +1.37%
Urban and Built-up                  |  0.1321 |  0.1388 | +0.0067 |   +5.07%
Cropland/Natural Vegetation Mosaic  |  0.0065 |  0.0110 | +0.0045 |  +69.23%
Snow and Ice                        |  0.0001 |  0.0005 | +0.0004 | +400.00%
Barren or Sparsely Vegetated        |  2.2241 |  2.2339 | +0.0098 |   +0.44%

The 2001 and 2019 columns show coverage as a percent of total area; the last column shows the percent change from 2001 to 2019.
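The % Chg column is just the ratio of the two coverage columns, minus one. For example, verifying the first row (Evergreen Needleleaf Forest):

awk 'BEGIN { printf "%+.2f%%\n", (0.4885/0.4615-1)*100 }'   # prints +5.85%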

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/28
# File: australia.sh
# Run: source australia.sh; require; download <user> <pass>; prepare; maps; animate; analyze

require() { sudo apt-get install hdf4-tools imagemagick; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse_mlc() {
    ncdump-hdf -v Majority_Land_Cover_Type_1 $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%02d ", $i}' | fold -w21600 > $1.mlc
}

parse_lct() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%03d ", $i}' | fold -w489600 > $1.dat
}

prepare() { parse_mlc 2001; parse_mlc 2019; parse_lct 2001; parse_lct 2019; }

aus() { ( echo -e 'P3\n830 680\n255'; sed -n 2001,2680p $1.mlc | awk '{
    for (x=5851; x<=6680; x++) {
        if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) printf "000 000 128 "
        else { if ($x==0) printf "000 000 128 "
            if ($x == 01) printf "024 160 064 "
            if ($x == 02) printf "041 216 082 "
            if ($x == 03) printf "156 216 083 "
            if ($x == 04) printf "158 251 183 "
            if ($x == 05) printf "151 202 178 "
            if ($x == 06) printf "193 163 181 "
            if ($x == 07) printf "244 230 206 "
            if ($x == 08) printf "219 240 188 "
            if ($x == 09) printf "249 224 000 "
            if ($x == 10) printf "239 198 160 "
            if ($x == 11) printf "087 153 208 "
            if ($x == 12) printf "246 242 153 "
            if ($x == 13) printf "251 005 000 "
            if ($x == 14) printf "156 168 128 "
            if ($x == 15) printf "250 250 250 "
            if ($x == 16) printf "195 195 195 "
        }
    } print "" }' ) > .pnm 
    convert .pnm -fill white -stroke white -pointsize 30 -gravity NorthEast -annotate 0 "$1" aus$1.png
}

maps() { aus 2001; aus 2019; }

animate() { convert -loop 0 -delay 200 aus*.png animaus.gif; }

count() {
    sed -n 2001,2680p $1.dat | awk 'BEGIN { 
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180 
    } {
        l=(NR+2001)*180/3600-90.025
        A=(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        for (t=2; t<=17; t++)
            for (x=5851; x<=6680; x++) {
                if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) continue
                S[t] += $(17*x+t)/100*A
            }
    } END {
        for (t=2; t<=17; t++) SA+=S[t]
        for (t=2; t<=17; t++) {
            printf "%07.4f\n", S[t]/SA*100
        }
    }' 
}

analyze() { 
    echo 'Evergreen Needleleaf Forest
        Evergreen Broadleaf Forest
        Deciduous Needleleaf Forest
        Deciduous Broadleaf Forest
        Mixed Forests
        Closed Shrubland
        Open Shrublands
        Woody Savannas
        Savannas
        Grasslands
        Permanent Wetlands
        Croplands
        Urban and Built-up
        Cropland/Natural Vegetation Mosaic
        Snow and Ice
        Barren or Sparsely Vegetated' | tr -d '\t' > .type

    count 2001 > .2001
    count 2019 > .2019

    echo
    echo 'Type                                |   2001  |   2019  |  Change |  % Chg' 
    echo '------------------------------------+---------+---------+---------+---------'
    paste -d, .type .2001 .2019 | awk -F, '$2!=0 {
        printf "%-35s | %7.4f | %7.4f | %+7.4f | %+7.2f%\n", $1, $2, $3, $3-$2, ($3/$2-1)*100
    }'
    echo
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Surface Change

NASA provides global land cover classification data:

2011, Source

Unfortunately it stops in 2011. I did a little bit more digging and found a great resource here. What I wanted to do was show surface changes over time. Here’s my result:

Each year column shows coverage by percent, and the last column shows the percent change from 2001 to 2019.

No analysis in this post. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/25
# File: landchg.sh
# Run: source landchg.sh; require; download <user> <pass>; prepare; analyze

require() { sudo apt-get install hdf4-tools; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2010.hdf --user=$1 --password=$2 $base/2010.01.01/MCD12C1.A2010001.006.2018053185051.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
    for (i=1; i<=NF; i++) 
        printf "%03d ", $i
    }' | fold -w489600 | awk '{
    for (t=1; t<=17; t++) {
        for (l=0; l<=7199; l++)
            sum += $(17*l+t)
        printf "%.4f ", sum/7200
        sum = 0
    }
    print ""
    }' > $1.lat
}

area() { awk 'BEGIN { a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
    for (l=-89.975; l<=89.975; l+=0.05)
        printf "%.9f\n",(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2/70842.4493856
    }' > .area
}

whole() { paste -d ' ' .area $1.lat | awk '{ 
        for (i=2; i<=NF; i++) t[i] += $1*$i; 
    } END { 
        for(i in t) printf "%.4f ", t[i]
        print "" 
    }'
}

prepare() { parse 2001; parse 2010; parse 2019; }

analyze() { area; echo 'Water
    Evergreen Needleleaf Forest
    Evergreen Broadleaf Forest
    Deciduous Needleleaf Forest
    Deciduous Broadleaf Forest
    Mixed Forests
    Closed Shrubland
    Open Shrublands
    Woody Savannas
    Savannas
    Grasslands
    Permanent Wetlands
    Croplands
    Urban and Built-up
    Cropland/Natural Vegetation Mosaic
    Snow and Ice
    Barren or Sparsely Vegetated' | tr -d '\t' > .type

    whole 2001 | tr ' ' '\n' > .2001
    whole 2010 | tr ' ' '\n' > .2010
    whole 2019 | tr ' ' '\n' > .2019
    echo 'Type                                |  2001  |  2010  |  2019  |   % Chg' 
    echo '------------------------------------+--------+--------+--------+----------'
    paste -d, .type .2001 .2010 .2019 | sed '$d' | awk -F, '{
        printf "%-35s | %6.3f | %6.3f | %6.3f | %+7.3f%\n", $1, $2, $3, $4, ($4/$2-1)*100
    }'
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Us and Enceladus

Enceladus is the 6th largest moon of Saturn. It has the distinction of being the most reflective object in the solar system.

Photo by NASA’s Cassini Probe

The bond albedo of Enceladus is 0.81.

Let’s figure out what the average temperature of Enceladus should be using the standard approach. This is determined by 2 things:

  1. Insolation
  2. Longwave Radiation from Saturn.

The combined formula is:

( ( TSI*(1-Ea)/4 + (TSI*(1-Sa)/(Ds/Rs)^2)/4 )/σ )^0.25

TSI = Total Solar Irradiance

Ea = Enceladus Bond Albedo, Sa = Saturn Albedo

Rs = Saturn Radius, Ds = Distance from Saturn to Enceladus

We use data from here and here, albedo from [Howett 2010], and emissivity = 1 from [Howett 2003].

Do the math:

14.82*(1-0.81)/4 + (14.82*(1-0.342)/3.9494^2)/4 =

0.704 + 0.156 = 0.86

(0.86 / 5.67e-8)^0.25 = 62.4K
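The same arithmetic as a one-liner, in case you want to replay it with different albedos (constants as defined above):

awk 'BEGIN { TSI=14.82; Ea=0.81; Sa=0.342; DsRs=3.9494; sigma=5.67e-8
    F = TSI*(1-Ea)/4 + (TSI*(1-Sa)/DsRs^2)/4   # absorbed sunlight + Saturn longwave term
    printf "%.2f W/m^2 -> %.1f K\n", F, (F/sigma)^0.25 }'   # prints 0.86 W/m^2 -> 62.4 K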

So 62.4 K should be the average temperature of Enceladus. Is it?

No it’s not.

The average is about 13 K higher. This just goes to show that the standard climate science approach of using greybody calculations is wrong. It’s wrong everywhere except where temperatures accidentally correspond.

There is also definitely no explanation for Enceladus’ south pole aside from internal heat.

And if tiny planetary bodies have plenty of leaking internal heat, might not the Earth as well?

Based on data from previous flybys, which did not show the south pole well, team members expected that the south pole would be very cold, as shown in the left panel. Enceladus is one of the coldest places in the Saturn system because its extremely bright surface reflects 80 percent of the sunlight that hits it, so only 20 percent is available to heat the surface. As on Earth, the poles should be even colder than the equator because the sun shines at such an oblique angle there…

Equatorial temperatures are much as expected, topping out at about 80 degrees Kelvin (-315 degrees Fahrenheit), but the south pole is occupied by a well-defined warm region reaching 85 Kelvin (-305 degrees Fahrenheit). That is 15 degrees Kelvin (27 degrees Fahrenheit) warmer than expected. The composite infrared spectrometer data further suggest that small areas of the pole are at even higher temperatures, well over 110 degrees Kelvin (-261 degrees Fahrenheit). Evaporation of this relatively warm ice probably generates the cloud of water vapor detected above Enceladus’ south pole by several other Cassini instruments.

The south polar temperatures are very difficult to explain if sunlight is the only energy source heating the surface, though exotic sunlight-trapping mechanisms have not yet been completely ruled out. It therefore seems likely that portions of the polar region are warmed by heat escaping from the interior of the moon. This would make Enceladus only the third solid body in the solar system, after Earth and Jupiter’s volcanic moon Io, where hot spots powered by internal heat have been detected.

NASA

Don’t expect NASA to tell you how much Earth’s internal hotspots contribute to recent warming.

-Z

Trend in Global Fires

Climate alarmists claim that an increase in man-made greenhouse gas emission will cause more fires. For example …

Human-induced climate change promotes the conditions on which wildfires depend, increasing their likelihood …

ScienceDaily

Funk … says there is very well documented scientific evidence that climate change has been increasing the length of the fire season, the size of the area burned each year and the number of wildfires.

DW

The clearest connection between global warming and worsening wildfires occurs through increasing evapotranspiration and the vapor-pressure deficit.  In simple terms, vegetation and soil dry out, creating more fuel for fires to expand further and faster.

… Global warming will keep worsening wildfires …

SkepticalScience

Sounds serious. Is it true?

We show that fire weather seasons have lengthened across 29.6 million km2 (25.3%) of the Earth’s vegetated surface, resulting in an 18.7% increase in global mean fire weather season length. We also show a doubling (108.1% increase) of global burnable area affected by long fire weather seasons and an increased global frequency of long fire weather seasons across 62.4 million km2 (53.4%) during the second half of the study period.

— Nature: Climate-induced variations in global wildfire danger from 1979 to 2013

This is just about the most scientific paper I could find on the issue. Why are they obsessed with the length of the fire season? Why can’t they just answer the simple question: Is there more or less fire?

NASA has collected daily data on Active Fires since 2000.

Active Fires, March 2000 [Source]

I downloaded and analyzed all of their Active Fires data. Here’s the result:

Now it all makes sense. Climate scammers need to cherrypick locations and seasons in order to distract from the empirical truth that global fires have been decreasing. Disgusting.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/16
# File: fire.sh
# Run: source fire.sh; require; sets; download; index; plot

require() { sudo apt-get install -y gmt gnuplot; }

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD14A1_E_FIRE&year=$y"
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1 }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ S+=$1*$2 
    } END { printf "%0.4f\n", S }'
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    awk '$2>0.02 {"date +%j -d "$1 | getline t; 
        print substr($1,1,4)+t/365" "$2 }' .csv | gmt gmtregress | tee .trend | sed 1d | tr '\t' ' ' | cut -c-25 > plot.csv
    echo "
        set term png size 740,420
        set key outside top center horizontal
        set ytics format '%4.2f'
        set ytics 0.01; set mytics 5
        set xtics 2; set mxtics 2
        set xrange [2000:2021]
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Active Fires Index' w lines lw 2 lc rgb '#DD0000',\
                     '' u 1:3 t 'Linear Trend'  w lines lw 3 lc rgb '#440000'		
    "| gnuplot > fire.png 
}

Fortunate Global Greening

Update: See new information.

NASA offers a data product called a Vegetation Index. This can be used to track how green the Earth is.

February 2000, [Source]

Although many are familiar with recent global greening, I prefer to always check the source data. And so I downloaded all of their available 16-day-increment data from 2000 to 2021. Here’s my result:

0.0936 --> 0.1029 is +9.94%

10% global greening in 20 years! We are incredibly fortunate!

I just wish everyone felt that way. But you know not everyone does. Whatever humans do to enhance global greening is precisely what social parasites want to tax and regulate. No good deed goes unpunished.

Anyway, Enjoy 🙂 -Zoe

P.S. The Earth is ~29% land, so a global Veg Index of ~0.29 would mean all land is covered in heavy vegetation.

Update: See new information.

# Zoe Phin, 2021/02/16
# File: veg.sh
# Run: source veg.sh; sets; download; index; plot

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_NDVI_16&year=$y" 
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
    rm -f 201[789]-12-31.csv
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1
        }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ 
        S+=$1*$2 
    } END { 
        printf "%0.4f\n", S
    }'
}

yoy() {
    cat .csv | cut -c12- | tr '\n' ' ' | awk -vp=$1 '{
        for (i=0;i<p/2;i++) print ""
        for (i=p/2;i<=NF-p/2;i++) { s=0
            for (j=i-p/2; j<=i+p/2; j++)
                s+=$j/(p+1)
            printf "%.4f\n", s
        }
    }' > .yoy
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    yoy 22; paste -d ' ' .csv .yoy > plot.csv
    sed -n '12p;$p' .yoy | tr '\n' ' ' | awk '{printf "%s --> %s is %+0.2f%%\n", $1, $2, ($2/$1-1)*100 }'
    echo "
        set term png size 740,620
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01
        set mxtics 2
        set mytics 5
        set xrange ['2000-01-01':'2020-12-31']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Vegetation Index ' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t '1-Year Moving Avg' w lines lw 3 lc rgb '#005500'		
    "| gnuplot > veg.png 
}

Average Moon Day and Night Temperatures

NASA’s Moon Fact Sheet doesn’t give the diurnal temperature range for the entire moon, just the equator:

Diurnal temperature range (equator): 95 K to 390 K

Strange. They have collected the data. Why didn’t they do the calculations? So I could do it?

I went through the data for every 15-degree longitude increment available here.

Day is the center hot spot +/- 90 degrees. Night is everything outside of that.
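The calc() function below spells this out as 24 explicit day/night windows. A more compact equivalent (a hypothetical sketch, not the code I actually ran) folds the longitude difference from the snapshot center into [-180,180) and keeps the day side:

dayside() { # $1 = snapshot center longitude, in degrees east
    awk -vc=$1 '{ d = ($1 - c + 540) % 360 - 180   # longitude difference folded to [-180,180)
        if (d < 0) d = -d
        if (d < 90) print }'                       # the night side is the complement: d >= 90
}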

Here’s my result:

Lon    Day    Night
000: 303.914 099.629 
015: 304.115 099.809 
030: 304.250 099.569 
045: 304.342 099.402 
060: 303.527 099.818 
075: 303.196 099.688 
090: 302.704 099.543 
105: 302.347 099.650 
120: 301.705 099.676 
135: 301.474 099.267 
150: 301.550 099.314 
165: 300.939 099.281 
180: 300.458 099.378 
195: 301.062 099.347 
210: 301.293 099.516 
225: 302.147 099.307 
240: 303.114 099.249 
255: 302.813 099.433 
270: 302.921 099.221 
285: 303.267 099.054 
300: 303.318 099.161 
315: 303.682 099.245 
330: 303.588 099.397 
345: 304.116 099.122 

Avg: 302.743 099.420

Whole Moon:  201.082

As you can see, the time-averaged whole Moon goes from a nightly low of 99.420 K to a daytime high of 302.743 K, with a 24 moon-hour average of 201.082 K (the simple mean of the two hemispheres).

I assume that day and night are each a 12 moon-hour period. This may not philosophically be so, but my whole purpose was to figure out the difference between the light and dark equal-area hemispheres, not to compare unequal light and dark areas.

I’ll contact NASA’s Fact Sheet webadmin to ask him to update.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/15
# File: moont.sh
# Run: source moont.sh; download; calc

download() {
    for l in {000..345..15}; do                   
        echo http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.xyz        
    done | wget -ci -
}

avg() {
    awk '{ f=1/581.9; e=2*f-f^2; r=atan2(0,-1)/180
        T[$2]+=$3; N[$2]+=1; A[$2]+=r/1438.355*(1-e)*cos(r*$2)/(1-e*sin(r*$2)^2)^2
    } END {for (L in T) printf "%+06.2f %7.3f %.15f\n", L, T[L]/N[L], A[L]}' | awk '{
        T+=$2*$3 } END { printf "%07.3f ", T }'
}

calc() {
    for l in {000..345..15}; do                   
        echo -n "$l: "
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && ($1>-90 && $1<90 ))			{ print }
        (l=="015" && ($1>-75 && $1<105))			{ print }
        (l=="030" && ($1>-60 && $1<120))			{ print }
        (l=="045" && ($1>-45 && $1<135))			{ print }
        (l=="060" && ($1>-30 && $1<150))			{ print }
        (l=="075" && ($1>-15 && $1<165))			{ print }
        (l=="090" && ($1>0   && $1<180))  			{ print }
        (l=="105" && ($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && ($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && ($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && ($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && ($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && ($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && ($1>105 || $1<-75))			{ print }
        (l=="210" && ($1>120 || $1<-60))			{ print }
        (l=="225" && ($1>135 || $1<-45))			{ print }
        (l=="240" && ($1>150 || $1<-30))			{ print }
        (l=="255" && ($1>165 || $1<-15))			{ print }
        (l=="270" && ($1<0 ))						{ print }
        (l=="285" && ($1<15 && $1>-165))			{ print }
        (l=="300" && ($1<30 && $1>-150))			{ print }
        (l=="315" && ($1<45 && $1>-135))			{ print }
        (l=="330" && ($1<60 && $1>-120))			{ print }
        (l=="345" && ($1<75 && $1>-105))			{ print }
        ' | avg
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && !($1>-90 && $1<90 ))			{ print }
        (l=="015" && !($1>-75 && $1<105))			{ print }
        (l=="030" && !($1>-60 && $1<120))			{ print }
        (l=="045" && !($1>-45 && $1<135))			{ print }
        (l=="060" && !($1>-30 && $1<150))			{ print }
        (l=="075" && !($1>-15 && $1<165))			{ print }
        (l=="090" && !($1>0   && $1<180))  			{ print }
        (l=="105" && !($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && !($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && !($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && !($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && !($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && !($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && !($1>105 || $1<-75))			{ print }
        (l=="210" && !($1>120 || $1<-60))			{ print }
        (l=="225" && !($1>135 || $1<-45))			{ print }
        (l=="240" && !($1>150 || $1<-30))			{ print }
        (l=="255" && !($1>165 || $1<-15))			{ print }
        (l=="270" && !($1<0 ))						{ print }
        (l=="285" && !($1<15 && $1>-165))			{ print }
        (l=="300" && !($1<30 && $1>-150))			{ print }
        (l=="315" && !($1<45 && $1>-135))			{ print }
        (l=="330" && !($1<60 && $1>-120))			{ print }
        (l=="345" && !($1<75 && $1>-105))			{ print }
        ' | avg
        echo
    done | awk '
        BEGIN { print "Lon    Day    Night" }
              { D+=$2; N+=$3; print }
        END   { printf "\nAvg: %07.3f %07.3f\n\nWhole Moon:  %07.3f", D/24, N/24, D/48+N/48}'
}

### Blog Extra ###

require() { sudo apt-get install -y imagemagick; }

dlimgs() {
    for l in {000..345..15}; do                   
        wget -O L$l.png http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.png        
    done
}

scale() { n=0
    for l in {000..345..15}; do                   
        convert -quality 30% -scale 12% L$l.png SL$(printf "%03d" $n).jpg
        let n++
    done
}

animate() { convert -delay 30 -loop 0 SL*.jpg animoon.gif; }

Big Blue Marble

I don’t know about you, but I always thought this was a beautiful image:

Blue Marble ; Source: NASA, 2012, Showing 2004.

I decided to fix it per this article, and make a large (1920×960) animated version. The result is here. It’s 7MB, so please wait for it to load. Right-click and save the image in case WordPress is annoying. I made it my wallpaper, and so can you!

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/14
# File: terra.sh
# Run: source terra.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=BlueMarbleNG' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O terra$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=1920&height=960"
        let n++
    done
}

fix() { 
    for jpg in `ls -1 terra*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n1920 960\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 1,3d | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w23040 > .tmp

        cut -c1-756 .tmp > .right
        cut -c756-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 50 -loop 0 terra*.jpg animterra.gif; }

Annual Leaf Cycle

Our Beautiful Living and Breathing Planet

The map is fixed per this article.

# Zoe Phin, 2021/02/13
# File: lai.sh
# To Run: source lai.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD15A2_M_LAI&date=2016-01-01' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O lai$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=750&height=375"
        sleep 1
        let n++
    done
}

fix() { 
    for jpg in `ls -1 lai*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n750 375\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w9000 > .tmp

        cut -c1-300 .tmp > .right
        cut -c300-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 25 -loop 0 lai*.jpg animlai.gif; }

Happy Valentines Day!

❤ -Zoe

Effect of Clouds on Global Upwelling Radiation

I downloaded and analyzed 10 gigabytes of data, fully covering the years 2003 to 2019, from “the only project worldwide whose prime objective is to produce global climate data records of ERB [Earth’s Radiation Budget] from instruments designed to observe the ERB” [site] [data], in order to see the effect of clouds at the surface, especially on Upwelling Longwave Radiation (LW_UP).

NASA Reminds us …

High clouds are much colder than low clouds and the surface. They radiate less energy to space than low clouds do. The high clouds in this image are radiating significantly less thermal energy than anything else in the image. Because high clouds absorb energy so efficiently, they have the potential to raise global temperatures. In a world with high clouds, much of the energy that would otherwise escape to space is captured in the atmosphere. High clouds make the world a warmer place. If more high clouds were to form, more heat energy radiating from the surface and lower atmosphere toward space would be trapped in the atmosphere, and Earth’s average surface temperature would climb.

NASA

In contrast to the warming effect of the higher clouds, low stratocumulus clouds act to cool the Earth system. Because lower clouds are much thicker than high cirrus clouds, they are not as transparent: they do not let as much solar energy reach the Earth’s surface. Instead, they reflect much of the solar energy back to space (their cloud albedo forcing is large). Although stratocumulus clouds also emit longwave radiation out to space and toward the Earth’s surface, they are near the surface and at almost the same temperature as the surface. Thus, they radiate at nearly the same intensity as the surface and do not greatly affect the infrared radiation emitted to space (their cloud greenhouse forcing on a planetary scale is small). On the other hand, the longwave radiation emitted downward from the base of a stratocumulus cloud does tend to warm the surface and the thin layer of air in between, but the preponderant cloud albedo forcing shields the surface from enough solar radiation that the net effect of these clouds is to cool the surface.

NASA

Here’s the global percent of clouds by type (note that, as printed, each row accumulates the rows above it; differencing gives roughly 8.4, 16.3, 11.6, and 30.4 percent for Types 1–4, and a Type 5 total of about 66.6 percent):

Clouds  Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
Type_1  008.379  007.999  007.839  008.140  008.443  008.367  008.345  008.524  008.550  008.229  008.157  007.984  007.999  008.028  008.256  008.465  009.469  009.641
Type_2  024.677  023.556  023.799  024.149  024.438  024.168  024.382  024.580  024.419  024.181  024.766  024.539  024.534  024.796  025.193  025.493  026.317  026.195
Type_3  036.259  035.815  035.721  035.894  036.028  035.646  036.004  036.248  035.742  035.566  036.363  036.144  036.194  036.563  036.856  036.918  037.531  037.173
Type_4  066.637  067.458  067.597  067.381  066.701  066.395  066.248  066.500  066.579  066.149  066.087  066.093  066.003  066.103  066.577  066.569  067.425  066.972
Type_5  133.275  134.917  135.194  134.763  133.403  132.790  132.496  133.001  133.157  132.298  132.173  132.186  132.007  132.206  133.154  133.139  134.851  133.944

Cloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)

The project keeps track of 4 different types of observed LW_UP: All, Clr, AllNoAero, and Pristine. All is normal observed sky. Clr (clear) is no clouds. AllNoAero is All minus aerosols. Pristine is Clr minus aerosols.

Since clouds play an important role in Earth’s supposed greenhouse effect, and this effect supposedly leads to serious warming at the surface, we should see a very large difference among these 4 scenarios.

The results (Units = W/m²):

Series               Average     2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_up        397.445  397.191  396.820  397.667  397.222  397.033  396.243  396.924  397.166  396.364  396.883  397.063  397.361  398.266  398.894  398.455  398.166  398.848
all_sfc_lw_up        398.167  397.921  397.559  398.404  397.945  397.750  396.955  397.632  397.876  397.076  397.598  397.795  398.090  398.992  399.625  399.189  398.874  399.551
pristine_sfc_lw_up   397.387  397.135  396.763  397.610  397.165  396.974  396.182  396.866  397.107  396.306  396.825  397.006  397.305  398.207  398.836  398.397  398.106  398.790
allnoaero_sfc_lw_up  398.129  397.885  397.522  398.368  397.907  397.711  396.914  397.594  397.838  397.038  397.560  397.758  398.054  398.953  399.587  399.152  398.834  399.513

But in fact there is very little difference. The difference in surface LW_UP between a Pristine sky (no clouds, no aerosols) and All sky (see the cloud data above) is just 398.167 - 397.387 = 0.78 W/m².

I would even argue it might be ZERO. It’s only not zero because a satellite can’t measure two scenarios in the same place at the same time; it can only measure someplace nearby, or the same place at another time. Even if I’m wrong about this, the value is still very unimpressive.

Now let’s look at downwelling longwave radiation (LW_DN) and longwave radiation at the top of the atmosphere (TOA_LW):

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_dn        317.924  317.702  317.175  318.077  317.760  317.364  316.483  317.572  318.370  316.923  317.328  317.615  318.045  319.242  319.663  318.692  318.146  318.559
all_sfc_lw_dn        347.329  347.436  347.344  348.132  347.250  346.673  345.582  346.526  347.440  346.029  346.573  347.385  347.673  348.678  349.256  348.454  346.994  347.173
pristine_sfc_lw_dn   316.207  316.004  315.473  316.394  316.063  315.611  314.691  315.852  316.654  315.192  315.589  315.934  316.384  317.490  317.954  316.968  316.400  316.867
allnoaero_sfc_lw_dn  346.359  346.490  346.395  347.196  346.297  345.669  344.546  345.549  346.466  345.048  345.590  346.448  346.754  347.694  348.296  347.489  345.987  346.195

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_toa_lw_up        262.503  262.373  262.267  262.645  262.446  262.584  262.087  262.268  262.521  262.179  262.185  262.499  262.543  262.668  263.075  262.942  262.535  262.735
all_toa_lw_up        237.889  237.990  237.924  238.257  237.970  238.339  237.685  237.764  238.165  237.975  237.581  237.895  237.973  238.027  237.999  237.848  237.167  237.557
pristine_toa_lw_up   262.979  262.833  262.720  263.102  262.911  263.070  262.598  262.743  262.988  262.665  262.684  262.965  263.009  263.165  263.547  263.419  263.033  263.198
allnoaero_toa_lw_up  238.168  238.260  238.189  238.523  238.242  238.626  237.987  238.042  238.438  238.260  237.874  238.167  238.245  238.320  238.274  238.126  237.456  237.827

Let’s now compare the averages side by side for all 3:

Series               Average

clr_toa_lw_up        262.503
all_toa_lw_up        237.889
pristine_toa_lw_up   262.979
allnoaero_toa_lw_up  238.168

clr_sfc_lw_dn        317.924
all_sfc_lw_dn        347.329
pristine_sfc_lw_dn   316.207
allnoaero_sfc_lw_dn  346.359

clr_sfc_lw_up        397.445
all_sfc_lw_up        398.167
pristine_sfc_lw_up   397.387
allnoaero_sfc_lw_up  398.129

The standard greenhouse effect narrative is that infrared-absorbing gases prevent radiation from reaching space and this causes warming at the surface (and thus more radiation from it). Well, we clearly see that’s not the case. If clouds (water vapor + aerosols) hardly change outgoing surface radiation, then the whole hypothesis is in error. Less top-of-atmosphere outgoing radiation doesn’t cause surface heating and thus more radiation from the surface, despite the increase in downwelling radiation.

Enjoy 🙂 -Zoe

Update 02/28

Resident Biden’s Senior Climate Advisor reminds us

We quantify the impact of each individual absorber in the total effect by examining the net amount of long‐wave radiation absorbed in the atmosphere (G, global annual mean surface upwelling LW minus the TOA LW upwelling flux) [Raval and Ramanathan, 1989; Stephens and Greenwald, 1991]. This is zero in the absence of any long‐wave absorbers, and around 155 W/m2 in the present‐day atmosphere [Kiehl and Trenberth, 1997]. This reduction in outgoing LW flux drives the 33°C greenhouse effect defined above, and is an easier diagnostic to work with.

Gavin Schmidt et al.

that the greenhouse effect (G) is just SFC_LW_UP minus TOA_LW_UP. So let’s do that for all scenarios:

clr        397.445 - 262.503 = 134.942
all        398.167 - 237.889 = 160.278
pristine   397.387 - 262.979 = 134.408
allnoaero  398.129 - 238.168 = 159.961

So there is definitely a mathematical “greenhouse effect” difference between the 4 scenarios, and yet this makes no difference to surface upwelling radiation, and by extension to surface temperature.

Varying the amount of “greenhouse effect” means nothing to surface temperature.

Since the absorption of radiation by IR active gases makes no difference to surface temperature, the greenhouse effect hypothesis is simply incorrect and should be abandoned for the sake of empirical science.

-Zoe

Code clouds.sh:

# Zoe Phin, 2021/02/09

require() { sudo apt install -y hdf4-tools; }

download() { 
    mkdir -p ceres; n=4
    for y in {2003..2019}; do 
        for m in {01..12}; do
            [ $y$m -ge 201507 ] && n=5
            [ $y$m -ge 201603 ] && n=6
            [ $y$m -ge 201802 ] && n=7
            wget -O ceres/$y$m.hdf -c "https://opendap.larc.nasa.gov/opendap/hyrax/CERES/SYN1deg-Month/Terra-Aqua-MODIS_Edition4A/$y/$m/CER_SYN1deg-Month_Terra-Aqua-MODIS_Edition4A_40${n}406.$y$m.hdf"
        done
    done
}

cmd() { ncdump-hdf -l999 ceres/$1$2.hdf -v "$3"; }

lwup() { series='init_clr_sfc_lw_up init_all_sfc_lw_up init_pristine_sfc_lw_up init_allnoaero_sfc_lw_up'; lw; }
lwdn() { series='init_clr_sfc_lw_dn init_all_sfc_lw_dn init_pristine_sfc_lw_dn init_allnoaero_sfc_lw_dn'; lw; }
lwta() { series='init_clr_toa_lw_up init_all_toa_lw_up init_pristine_toa_lw_up init_allnoaero_toa_lw_up'; lw; }

lw() {
    printf "\n%-20s %-11s" Series Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    for s in $(echo $series); do
        printf "%-20s = " $s
        for y in {2003..2019}; do
            for m in {01..12}; do
                cmd $y $m ${s}_glob | sed -n 3173,+2p
            done | awk -vv="${s}_glob" '$1==v{s+=$3}END{printf "%07.3f ",s/12}'
        done
        echo
    done | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
    }' | sed -r 's/init_|adj_//' | column -t
}

clouds() {
    rm -f .m* .y* .cld
    printf "\n%-7s %-11s" Clouds Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    printf "Type_%d =\n" $(seq 5) > .cld
    for y in {2003..2019}; do 
        for m in {01..12}; do 
            cmd $y $m obs_cld_amount_glob | sed -n 3173,+2p | grep -o '[0-9].*[0-9]' | tr ',' '\n' > .m$m
        done 
        paste .m* | awk '{ s=0; for(i=1;i<=NF;i++) s+=$i; printf "%07.3f\n", s/12 }' > .y$y
    done
    ( 	paste -d ' ' .cld .y* | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
        }' | column -t
        echo -e '\nCloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)'
    )
}

Run:

$ source clouds.sh; require && download
$ clouds; lwup; lwdn; lwta

Greenhouse Gases as Coolants

There, I said it. Don’t believe me? I will show you …

NASA offers an online tool for measuring the effects of clouds, aerosols, and greenhouse gases.

Set Output to OUTPUT_details. Note the CO2 (ppmv) setting in the bottom left. Click the Compute button to apply your form changes. Output appears below the form, so scroll down. Result:

Purple Ellipse = LWUP @ one meter above surface

I wrote a program to see changes to Upwelling Longwave Radiation (LWUP) at 1 meter above the surface under different CO2 ppmv settings and zones. Here is the result:

PPM  Trop   MLS    MLW    SAS    SAW
 15 456.36 421.41 309.39 382.31 246.71 
 30 456.35 421.41 309.41 382.31 246.75 
 60 456.34 421.41 309.43 382.31 246.80 
120 456.33 421.40 309.46 382.31 246.87 
180 456.32 421.40 309.47 382.31 246.91 
240 456.32 421.40 309.49 382.31 246.95 
300 456.31 421.40 309.50 382.31 246.97 
360 456.31 421.40 309.50 382.31 246.99 
420 456.30 421.40 309.51 382.31 247.01 
480 456.30 421.39 309.51 382.31 247.02 
540 456.29 421.39 309.52 382.30 247.03 
600 456.29 421.39 309.52 382.30 247.04 

Units are in W/m²

(Trop=Tropics, MLS=Mid-Latitude Summer, MLW=Mid-Latitude Winter, SAS=Subarctic Summer, SAW=Subarctic Winter)

You see it???
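Any single cell of that table can be spot-checked with one request, reusing the url template and the "SLW2 7-20" output line that co2() in rtransfer.sh (below) relies on. For example, the default Mid-Latitude Summer atmosphere at 410 ppm should print roughly 421.40:

$ . rtransfer.sh; echo $url | sed "s/ //g; s/CO2=X/CO2=410/" | wget -qO- -i- | awk '/SLW2 7-20/{print $6}'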

NASA’s tool also allows you to edit the atmospheric composition of water vapor, by setting Atmosphere EDIT to Detail.

I automated changes to sea level water vapor content while maintaining the same CO2 level (410 ppm) and the same zone (Mid-Latitude Summer). Result:

0.001 423.39 
0.002 423.39 
0.004 423.39 
0.010 423.39 
0.020 422.07 
0.040 421.78 
0.100 421.31 
0.200 421.13 
0.400 421.24

Anyway, that’s all the time I have for now. -Zoe

Update 02/08

While my analysis for CO2 is correct, it appears my H2O analysis was too simplistic. I have re-written the code. What I do now is use all 5 climate zones and change the water vapor content in the whole atmospheric column, not just near the surface. I divide the original content by 2, 4, 8, and 16, and then multiply by the same factors.

New Result:

  WV-X   Trop   MLS    MLW    SAS    SAW
0.0625X 455.63 420.61 308.94 381.72 246.23 
 0.125X 455.73 420.76 309.09 381.86 246.42 
  0.25X 455.83 420.91 309.24 381.98 246.61 
   0.5X 456.01 421.09 309.37 382.11 246.80 
     1X 456.30 421.40 309.51 382.31 247.01 
     2X 456.40 421.70 309.70 382.59 247.22 
     4X 456.02 421.64 309.95 382.65 247.48 
     8X 455.53 421.31 310.08 382.33 247.82 
    16X 455.41 421.12 309.91 381.96 248.11  

There is now warming in every zone but the tropics. No problem … The extra energy needed to raise the water vapor content is exactly what these calculations reflect. What you’re seeing are the new, raised fluxes needed to sustain the raised WV content.

Apologies if you feel the title of this article is now misleading. I strive for truth and accuracy.

Another Update 02/08 🙂

I updated the code to check every spectral type, not just the Ocean. The function h2o_diff tracks the change in effect from multiplying water vapor content 256-fold (16X versus 0.0625X). Here is the result:

   Type  Trop   MLS    MLW    SAS    SAW
    01  -1.31  -0.59  -0.22  -0.93  +0.82
    02  -1.31  -0.59  -0.22  -0.93  +0.82
    03  +0.28  +0.85  +0.80  +0.39  +1.52
    04  +0.28  +0.85  +0.80  +0.39  +1.52
    05  -0.54  +0.10  +0.27  -0.29  +1.15
    06  +2.59  +2.98  +2.45  +2.42  +2.75
    07  +8.74  +8.61  +6.77  +7.76  +5.92
    08  -0.45  +0.17  +0.30  -0.24  +1.15
    09  -0.45  +0.17  +0.30  -0.24  +1.15
    10  -0.45  +0.17  +0.30  -0.24  +1.15
    11  -0.32  +0.36  +0.64  +0.01  +1.52
    12  -0.45  +0.17  +0.30  -0.24  +1.15
    13  -2.47  -1.63  -0.94  -1.90  +0.32
    14  -0.47  +0.16  +0.30  -0.24  +1.17
    15  -2.44  -1.60  -0.91  -1.86  +0.35
    16 +11.84 +11.45  +8.95 +10.45  +7.52
    17  -0.22  +0.51  +0.97  +0.24  +1.88

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

I did the same for CO2 (co2_diff):

   Type  Trop   MLS    MLW    SAS    SAW
    01  -0.09  -0.06  -0.06  -0.09  +0.10
    02  -0.09  -0.06  -0.06  -0.09  +0.10
    03  -0.07  -0.03  +0.00  -0.05  +0.15
    04  -0.07  -0.03  +0.00  -0.05  +0.15
    05  -0.07  -0.04  -0.03  -0.07  +0.13
    06  -0.04  +0.01  +0.11  +0.01  +0.30
    07  +0.02  +0.11  +0.41  +0.17  +0.63
    08  -0.08  -0.04  -0.05  -0.07  +0.11
    09  -0.08  -0.04  -0.05  -0.07  +0.11
    10  -0.08  -0.04  -0.05  -0.07  +0.11
    11  -0.07  -0.03  +0.05  -0.04  +0.23
    12  -0.08  -0.04  -0.05  -0.07  +0.11
    13  -0.10  -0.07  -0.10  -0.12  +0.06
    14  -0.07  -0.04  -0.03  -0.07  +0.12
    15  -0.10  -0.08  -0.09  -0.11  +0.07
    16  +0.06  +0.16  +0.55  +0.25  +0.80
    17  -0.07  -0.02  +0.13  -0.01  +0.33

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

Code rtransfer.sh:

# Zoe Phin, v2.2: 2021/02/08

url='https://cloudsgate2.larc.nasa.gov/cgi-bin/fuliou/runfl.cgi?CASE=A
&Compute=Compute&ID=014605%0D%0A&DOUT=F&FOUT=1
&SELOUT=OUTPUT_details
&ATM=mls.atm&EATM=No
&CZA=0.5&VZA=1.0
&STREAM=GWTSA&SFCALB=IGBP
&SFCTYPE=17
&FOAM=OFF&WIND=5.0
&CF3=0.0&CHL=0.1
&CF1=1.0&COD1=1.0&CLDTOP1=250&CLDBOT1=300&PHASE1=ICE&CLDPART1=60&CINH1=100
&CF2=0.0&COD2=10.0&CLDTOP2=850&CLDBOT2=900&PHASE2=WATER&CLDPART2=20&CINH2=100
&AOT1=0.20&AOTTYPE1=continental&AOTSH1=4
&AOT2=0.00&AOTTYPE2=0.5_dust_l2004&AOTSH2=1
&CONT=2.1_ckd&ELEV=0.0
&RES=HI
&CO2=X'

types() { echo '
     1 Evergreen Needle Forest   11 Wetlands
     2 Evergreen Broad Forest    12 Crops
     3 Deciduous Needle Forest   13 Urban
     4 Deciduous Broad Forest    14 Crop/Mosaic
     5 Mixed Forest              15 Permanent Snow
     6 Closed Shrub              16 Barren / Desert
     7 Open Shrub                17 Ocean
     8 Woody Savanna             18 Tundra
     9 Savanna                   19 Fresh Snow
    10 Grassland                 20 Sea Ice
    ' | tr -d '\t'
}

co2() {
    echo "PPM  Trop   MLS    MLW    SAS    SAW"
    for ppm in 15 30 60 120 180 240 300 360 420 480 540 600; do
        printf "%3d " $ppm
        for zone in trop mls mlw sas saw; do
            echo $url | sed "s/ //g; s/CO2=X/CO2=$ppm/; s/ATM=mls/ATM=$zone/" | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done 
}

co2_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        T=$(printf "%02d" $t)
        sed -n "/Type $T/,/^$/p" co2.csv | sed -n '3,14p' | cut -c4- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

h2o() {
	for atm in trop mls mlw sas saw; do
		echo $url | sed "s/ //g; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$atm/" | wget -qO- -i- | sed -n '/<textarea /,/\/textarea>/p;' | sed '1d;$d' > $atm.prof
	done

    echo "  WV-X   Trop   MLS    MLW    SAS    SAW"
    for w in 0.0625 0.125 0.25 0.5 1 2 4 8 16; do
        printf "%6gX " $w
        for zone in trop mls mlw sas saw; do
            atmo=$(awk -vw=$w '{printf "%-7G %8.4f %13G %13G%0D%0A\n", $1, $2, $3*w, $4}' $zone.prof | tr ' ' '+')

            (echo $url | sed "s/ //g; s/CO2=X/CO2=410/; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$zone/"; 
            echo "&ATMOSPHERE=$atmo") | tr -d '\n' | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done | tee h2o.csv
}

h2o_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        sed -n "/Type $t/,/^$/p" h2o.csv | sed -n '3,11p' | cut -c9- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

Run it:

$ . rtransfer.sh; co2
$ . rtransfer.sh; h2o


$ . rtransfer.sh; co2_diff  # (must be run after co2)
$ . rtransfer.sh; h2o_diff  # (must be run after h2o)

Automated Twit

I needed to create a throwaway Twitter account for a research project. I decided to automate its creation and share the code with you 🙂

I used the handy service of 10minutemail for receiving Twitter’s verification code.

I used the wonderful browser automation tool Nightmare.

Automation in action:

Patiently wait 10 seconds at the beginning, and again for the verification code to arrive.

Code twit.sh:

# Zoe Phin, 2021/01/28

require() { sudo apt-get -y install npm && npm install nightmare; }

newtwit() { echo "nm = require('nightmare')
    main().catch(e=>{console.log('done.')})
    async function main() {
        e = nm({show: false})
        await e.goto('https://10minutemail.com').wait(2000)

        email = await e.evaluate( ()=> {return document.querySelector('input').value} )
        console.log(email)

        n = nm({show: true}).viewport(740,680)
        await n.goto('https://twitter.com/i/flow/signup').wait(6000)

        await n.insert('input[name=name]','Unique Snowflake')

        await n.evaluate( ()=> { 
            document.querySelectorAll('div[role=button]')[1].children[0].click() } )

        await n.insert('input[name=email]', email)

        await n.select('#Month','1').select('#Day','29').select('#Year','1999')

        await n.type('body','\t\t\t\t\t\t ')
        await n.type('body','\t\t\t \t\t\t ')
        await n.type('body','\t\t\t\t\t\t\t\t\t ')
 
        vcode = await e.wait(10000).evaluate( ()=> {
            return document.querySelector('div.small_subject').innerText.substr(0,6) })

        await n.insert('input', vcode).type('body','\t\t\t ')
        console.log(vcode)
        
        await n.wait(2000).type('input[name=password]', 'Un1qu3 Sn0wfl4k3!')
        await n.wait(1000).type('body','\t\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(2000).type('body','\t\t ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body','\t ')
    //	await n.wait(5000).end()
    } 
    " | node 
}

Setup NodeJS and Nightmare:

$ . twit.sh; require

Run:

$ . twit.sh; newtwit

Note: As I’m not an adept browser-bot coder, this code may fail once or twice before working. Just run it until it does. Hopefully someone with more time can fix it. It’s good enough for me.

Enjoy 🙂 -Zoe

P.S. Did you notice Twitter is pushing POTUS as the most important person to follow?

White House Youtube Dislike Manipulation

I’ve seen screenshots of YouTube modifying dislikes of White House videos. I decided I would do a thorough analysis myself. I wrote a script to check video stats every 80 seconds for 24 hours – for all videos on White House’s YouTube channel.

The collected data is archived here and here. The format is space-separated “CSV”, as follows:

VideoURL UnixTimestamp Date,Time Views Likes Dislikes
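For instance, the dislike share of every snapshot can be computed straight from this format (a sketch; fields 5 and 6 are the likes and dislikes):

awk '{ printf "%s %.2f%%\n", $1, 100*$6/($5+$6) }' data.csv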

Here is a sample of the most egregious manipulation:

Some videos were delisted within minutes!

https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771197 01/27/2021,13:13:17      1227       437      2963
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771285 01/27/2021,13:14:45      1463       441      2999
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771372 01/27/2021,13:16:12      1763       449      3030
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771459 01/27/2021,13:17:39      2476       455      3060
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771546 01/27/2021,13:19:06      2640       459      3098
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771720 01/27/2021,13:22:00      3588       470      3183
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699362 01/26/2021,17:16:02       918       405      4942
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699448 01/26/2021,17:17:28      1202       412      4976
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699534 01/26/2021,17:18:54      1375       415      5026
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766646 01/27/2021,11:57:26       255       375      1771
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766733 01/27/2021,11:58:53       455       380      1823
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766819 01/27/2021,12:00:19       455       383      1852
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766906 01/27/2021,12:01:46       819       387      1886
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766992 01/27/2021,12:03:12      1148       393      1932
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767079 01/27/2021,12:04:39      1462       397      1971
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767166 01/27/2021,12:06:06      1830       398      2019
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770591 01/27/2021,13:03:11      1587        83      2040
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770764 01/27/2021,13:06:04      3014        95      2114

Likes+Dislikes was greater than views in some cases. Although that seems impossible, YouTube updates views more slowly, so the view count doesn’t reflect real views at the time. For example:

https://www.youtube.com/watch?v=jw1_00uI02U 1611720090 01/26/2021,23:01:30     44404       924      8099
https://www.youtube.com/watch?v=jw1_00uI02U 1611720176 01/26/2021,23:02:56     44404       924      8118
https://www.youtube.com/watch?v=jw1_00uI02U 1611720260 01/26/2021,23:04:20     44404       925      8132
https://www.youtube.com/watch?v=jw1_00uI02U 1611720345 01/26/2021,23:05:45     44404       925      8151
https://www.youtube.com/watch?v=jw1_00uI02U 1611720429 01/26/2021,23:07:09     44404       925      8168
https://www.youtube.com/watch?v=jw1_00uI02U 1611720514 01/26/2021,23:08:34     44556       925      8184
https://www.youtube.com/watch?v=jw1_00uI02U 1611720599 01/26/2021,23:09:59     44556       925      8199
https://www.youtube.com/watch?v=jw1_00uI02U 1611720683 01/26/2021,23:11:23     44556       928      8219
https://www.youtube.com/watch?v=jw1_00uI02U 1611720768 01/26/2021,23:12:48     44556       928      8237

So it’s possible for likes and dislikes to accumulate while the view count stays the same. Eventually, the view count jumps up to better reflect reality.

The record of every time dislikes were removed is archived at https://pastebin.com/raw/F4ELDc4R

Grand Total         -130321

130 thousand dislikes were removed in a 24-hour period!

And this is for the most popular US President of all time!

Enjoy 🙂 -Zoe


Update

This research was featured in a YouTube video.


Code wh.sh:

# Zoe Phin, 2021/01/26

require() { sudo apt-get install -y curl gnuplot; }

stats() {
    list=$(curl -s 'https://www.youtube.com/c/WhiteHouse/videos' | grep -o 'watch?v=[^"]*')
    for i in $list; do
        link="https://www.youtube.com/$i"
        date=$(date +"%s %x,%R:%S" | tr -d '\n')
        curl -s $link | tr -d ',' | tr '}' '\n' > new 
        grep -m 1 -o '[0-9,]* views' new > .views
        grep -m 1 -o '[0-9,]* likes' new > .likes
        grep -m 1 -o '[0-9,]* dislikes' new  > .dislikes

        paste .views .likes .dislikes | awk -vL=$link -vD="$date" '
            NF==6{printf "%s %s %9s %9s %9s\n", L, D, $1, $3, $5}'
    done
}

collect() { # poll stats every 75 seconds, appending results to data.csv
    while true; do
        stats; sleep 75
    done | tee -a data.csv
}

dislikes() { # report every drop in a video's dislike count (column 6 of data.csv)
    list=$(cut -c1-44 data.csv | sort -u)

    for vid in $list; do	
        echo $vid
        grep ^$vid data.csv | awk '{
            DiffD=$6-D
            if (DiffD < 0) { 
                printf "%s %+7d\n", $3, DiffD 
                DLost+=DiffD
            }
            D=$6
        } END {
            printf "%-19s %7d\n", "Total", DLost
        }' 
        echo
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() { # chart views, likes, and dislikes over time for each video
    list=$(cut -c1-44 data.csv | sort -u)
    let n=0

    for vid in $list; do	
        let n++
        awk -vV=$vid '$1==V {print $2" "$4" "$5" "$6}' data.csv > plot.csv

        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title '$vid'
        set timefmt '%s'
        set xdata time
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Run:

$ source wh.sh; require

Collect data:

$ collect

( press Ctrl-C when you're done )

Record of dislike drops:

$ dislikes

Generate charts:

$ plot

Version 2.0: Cleaner, and grabs stats at a random 60-to-120-second interval.

# Zoe Phin, v2.0 - 2021/02/20

require() { sudo apt-get install -y gnuplot; }

collect() {
    url="https://www.youtube.com"
    while true; do
        for vid in $(wget -qO- "$url/c/WhiteHouse/videos" | grep -o 'watch?v=[^"]*'); do
            wget -qO- $url/$vid | egrep -o '[0-9,]* (views|likes|dislikes)' |\
            sed -n 1~2p | tr -d '[:alpha:],\n' |\
            awk -vL=$url/$vid -vD="$(date +"%s %x,%R:%S" | tr -d '\n')" '
                NF==3 { printf "%s %s %9s %9s %9s\n", L, D, $1, $2, $3 }'
        done
        sleep $(seq 60 120 | shuf | head -1)
    done | tee -a data.csv
}

dislikes() {
    for vid in $(cut -c1-44 data.csv | sort -u); do	
        awk -vv=$vid 'BEGIN { print v } $1==v { 
            Diff=$6-Last
            if (Diff < 0) { printf "%s %+7d\n", $3, Diff; Lost+=Diff }
            Last=$6
        } END {
            printf "%-19s %7d\n\n", "Total", Lost
        }' data.csv
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() { n=0
    for vid in $(cut -c1-44 data.csv | sort -u); do	let n++
        awk -vv=$vid '$1==v {print $2" "$4" "$5" "$6}' data.csv > plot.csv
        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title noenhanced '$vid'
        set xdata time
        set timefmt '%s'
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Something Rotten in Georgia

Updated – 2021/01/06, 06:50 PM EST

The results of the Georgia runoff elections do not make any logical sense to me. In the last 2 months I have probably seen political ads over 1000 times! No exaggeration. All for the 2 senate seats. There was not a single ad for Public Service Commission District 4, and yet the Republican running for this seat got more votes than the Republicans running for either senate seat:

You can also grab NY Times’ data using this Linux one-liner (assuming you have curl and jq installed):

$ curl -s https://static01.nyt.com/elections-assets/2020/data/api/2021-01-05/state-page/georgia.json | jq -Mc '.data.races[].candidates[]|[.votes,.last_name,.party_id]' | tr -d '["]'

The result matches the official GA site:

2272277,Warnock,democrat
2189111,Loeffler,republican
2253203,Ossoff,democrat
2208129,Perdue,republican
2227870,McDonald,republican
2185670,Blackman,democrat

McDonald got ~20K more votes than Perdue, and ~39K more than Loeffler.

So the question is: How did this happen? How did Republicans manage to vote more for a less important race?

Do you really believe Republicans would vote more for Public Service Commission District 4 than two senate seats ???

No way!

It sure smells like fraud. As if ballots were thrown out … or switched to Democrats.

Also, the Democrat for Commission District 4 got fewer votes than the other Democrats. As if many fake ballots were produced rapidly just for the senate seats, and the perpetrators didn’t have time to fill in this seat.

How about November 2020 senate election data? I combine both senate races:

$ curl -s 'https://static01.nyt.com/elections-assets/2020/data/api/2020-11-03/state-page/georgia.json' | jq -Mc '.data.races[1,2].candidates[]|[.party_id,.last_name,.votes]' | tr -d '["]' | tee november.csv

republican,Perdue,2462617
democrat,Ossoff,2374519
libertarian,Hazel,115039
write-ins,Write-ins,265
democrat,Warnock,1617035
republican,Loeffler,1273214
republican,Collins,980454
democrat,Jackson,324118
democrat,Lieberman,136021
democrat,Johnson-Shealey,106767
democrat,James,94406
republican,Grayson,51592
democrat,Slade,44945
republican,Jackson,44335
republican,Taylor,40349
republican,Johnson,36176
libertarian,Slowinski,35431
democrat,Winfield,28687
democrat,Tarver,26333
independent,Buckley,17954
green,Fortuin,15293
independent,Bartell,14640
independent,Stovall,13318
independent,Greene,13293
write-ins,Write-ins,34

I open the results in Excel and combine the data into Left and Right candidates. Republican and Libertarian are obviously Right. Democrat and Green are obviously Left. I give the Left a huge boost by including the Independents with the Left, and despite this …
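
If you’d rather skip Excel, here’s a quick awk tally of the same grouping (my sketch, reading the november.csv saved above; write-ins ignored, Independents counted with the Left as described):

$ awk -F, '$1=="republican"||$1=="libertarian" {R+=$3} $1~/democrat|green|independent/ {L+=$3} END {printf "Right: %d\nLeft:  %d\n", R, L}' november.csv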

The Right Wins. What’s surprising is that the Right lost both Senate Runoff Elections. Why?

Fraud!

Something is very rotten in the state of Georgia!

-Zoe

If you missed it: https://phzoe.com/2021/01/06/georgia-senate-runoff-timestamp-data/

Georgia Senate Runoff Timestamp Data

I was trying to find timestamp data for the Georgia 2021 Senate Runoff election. I couldn’t find it easily via a Google search, but I kept digging, and managed to find it and extract it from NY Times’ live prediction data feed. Here it is …

Senate Race 1
Senate Race 2

Code … georgia.sh:

# Zoe Phin, 2021/01/06

require() { sudo apt-get -y install curl jq gnuplot; }

download() { 
    curl -o ga1.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-G-S-2021-01-05.json
    curl -o ga2.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-S-S-2021-01-05.json
}

timeseries() {
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga1.json | tr -d '["]' > ga1.csv
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga2.json | tr -d '["]' > ga2.csv
}

format() {
    for i in 1 2; do
        (echo "Timestamp            Votes  Rep %   Dem %   Rep     Dem"
         awk -F, '{ "TZ=US/Eastern date +%x,%R:%S -d "$1 | getline t; printf "%s %7d %6.4f %6.4f %7d %7d\n", t, $2, $3, $4, $2*$3, $2*$4 }' ga$i.csv
        ) > ga$i.txt
    done
}

plot() {
    awk -F, '{ "TZ=US/Eastern date +%d%H%M%S -d "$1 | getline t; printf "%s %7d %7d\n", t, $2*$3, $2*$4 }' ga$1.csv > ga$1.dat
    (echo 'set term png size 640,480
    set key top left
    set grid xtics ytics
    set ylabel "Million Votes"
    set timefmt "%d%H%M%S"
    set xdata time
    set xtics format "01/%d\n%H:%M"
    set ytics format "%.1f"
    set mytics 5'
    echo "plot 'ga${1}.dat' u 1:(\$2/1e6) t '$2' w lines lc rgb 'red','' u 1:(\$3/1e6) t '$3' w lines lc rgb 'blue'"
    ) | gnuplot > ga$1.png
}

Run it:

$ source georgia.sh; require; download; timeseries; format

format generates timestamp data into two files: ga1.txt and ga2.txt. The results are archived here and here, respectively.

Race 1 is Perdue vs. Ossoff, and Race 2 is Loeffler vs. Warnock.

To plot the data:

$ plot 1 Perdue Ossoff
$ plot 2 Loeffler Warnock

This generates ga1.png and ga2.png, which I present above.

I left my opinion out of this post. Curious Windows coders should follow instructions here.

Enjoy the data 🙂 -Zoe

Heat flux in the Sun

The sun is known to emit ~63 megawatts per square meter (MW/m²) from its photosphere. But what is the heat flux inside this emissive photosphere?

Source

Heat flux formula: q = k*ΔT/L

q = k * (6600-4400 Kelvin) / (500,000 meters)

What is the thermal conductivity (k) value of hydrogen at these temperatures? [1]

This is actually very difficult to find, but I managed to find something:

Thermal Conductivity of Hydrogen, Source, Figure 5

The y-axis needs to be divided by 10 to get the units (W/m*K).

The range of pressure in the photosphere is: 0.0009 atm to 0.123 atm. I think it’s safe to say that thermal conductivity of hydrogen is definitely no more than 2.5 W/m*K in our desired range. That will be our upper limit. Thus,

q = 2.5 * 2200 / 500000 = 0.011 W/m² [2]
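
That arithmetic as a paste-able one-liner:

$ awk 'BEGIN { printf "q = %.3f W/m2\n", 2.5*(6600-4400)/500000 }'
q = 0.011 W/m2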

As you can see, there is no problem with 0.011 W/m² “supporting” a 63 MW/m² output.

My critics will be quick to point out that I can’t use the conduction formula because the sun only has radiative transfers in the photosphere. But that’s just their excuse for NEVER figuring out what the internal heat flow is. Any of their attempts at doing so will be embarrassing for them, and so they will avoid it at all cost. Surely there is a heat flux, and surely it can be figured out.

My critics believe in conservation of heat flow, that is: internal heat flux => emergent radiation. There must be equality of these two things, otherwise there will be rapid cooling. Well, the sun has had 4.5 billion years to reach their “steady-state” equilibrium nonsense and it’s nowhere close. Maybe despite all their chest thumping, they have no idea what they’re talking about?

What goes for the sun here goes for my geothermal theory as well.

Just as <0.011 W/m² internal heat flux can “support” a 63 MW/m² emission from the sun, so too can a ~0.9 W/m² geothermal heat flux “support” a ~334 W/m² emission from Earth’s surface.

And why is that? See here and here.

Think about it!

Enjoy 🙂 -Zoe

Note:

[1] I left out helium. I don’t care to include it, as it makes little difference. I only care about being right within an order of magnitude.

[2] I don’t include the surface area of emission, because the difference in solar radius between the top and bottom of the photosphere is too small.

Geothermal Denial

Climate “scientists” look at Earth’s geothermal heat flux, see that it’s small (~0.1 W/m²), and then conclude geothermal can’t possibly predominantly explain Earth’s surface temperature. This is plain wrong. I came up with an illustration to demonstrate my point a while ago:

The Heat Flux Fallacy

This is a fictional planet. I did this on purpose to accentuate my point. The question here is: Why is the surface 1000°C?

Mainstream climate “scientists” would see the small geothermal heat flux (0.1 W/m²), ignore it, and conclude it must be because Solar + Atmosphere delivers 148,971 W/m² to the surface. It couldn’t have anything to do with the 1010°C a hundred meters below the surface. Oh no, it can’t be that!

But did I say this fantasy planet even has an atmosphere? What if not? The sun only delivers 165 W/m² … Where’s the rest of the energy coming from to make the surface 1000°C?

And if there is an atmosphere … where did the atmosphere get 148,806 W/m² to give to the surface?

Climate cranks come to the rescue and claim that due to infrared absorbing gases in the atmosphere, the Sun’s 165 W/m² gets auto-magically boosted to 148,971 W/m², because that 165 W/m² can’t escape to space. You know what I have to say to that?

The Greenhouse Effect

Some people have serious problems accepting the truth: The surface here is 1000°C because it’s 1010°C a hundred meters below the surface. Simple.

Now why is that so hard to accept? Ideological presupposition. That’s why!

Let’s take a look at a recent discussion here:

Geothermal heat is about 0.1 W/m². Solar absorption is around 161 W/m². All solar is lost on a regular basis and heat loss by the surface is very (!) dynamic. Which means that a little bit more energy from below (for example 0.1 W/m2) is easily lost, together with the dynamic 1610 times higher ‘standard heat loss’.

Wim Röst

I said that her claim, that the heat flux leaving the ground was thousands of times larger than the heat flux passing through the ground, was physically impossible.

Willis Eschenbach

So what they are both saying in our context is that a 0.1 W/m² geothermal heat flux can’t “support” a 148,971 W/m² emission from the surface, so I must be wrong!

In Willis’ view, there must be equality between geothermal heat flux and surface emission.

He believes this is needed to preserve conservation of energy. But what is he really doing?

He’s equating a heat flux between two locations to an absolute energy flux equivalent at one location. Is that conservation of energy? No!

Energy is energy, and heat flux is the energy transfer from hot to cold, i.e. a DIFFERENCE of energies at two locations. How can you compare a differential to an absolute? It makes no sense. But don’t believe me, check your own eyes:

The top of the water represents the planet’s surface.

If what Willis et al were saying were true, we should expect a steep thermal gradient from the bottom to the top of the water column – so that conductive heat flux equals emergent radiation. But what actually happens?

As you can see, the top and bottom of the water column become the same temperature. In other words, the conductive (“geothermal”) heat flux becomes 0 W/m².

The top of this water column is capable of emitting εσT⁴, or (5.67e-8)*(273.15+83)^4 =~ 912 W/m²
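
You can verify that figure with a one-liner:

$ awk 'BEGIN { printf "%.0f W/m2\n", 5.67e-8*(273.15+83)^4 }'
912 W/m2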

In this case, 0 W/m² has no problem “supporting” 912 W/m²! That’s an infinite ratio!

So why can’t a 0.1 W/m² geothermal heat flux “support” a 148,971 W/m² emergent flux?

Of course it can. It’s not a problem at all. These people are simply confused on the physics.

Now imagine they had to explain this infrared electric kettle hot water video without mentioning the heat source below. How would they do that? Well…

The top of the kettle (“sun”) emits 20°C worth of radiation to the top of the water (419 W/m²). The water vapor and carbon dioxide in the air above the water prevent radiation from leaving to colder space, and so auto-magically the top of the water becomes 912 W/m², or 83°C. Simple!

Now do you see it?

Greenhouse Effect == Geothermal Denial.

Note: Real planetary subsurface has a density gradient, and so you will see a small geothermal heat flux. There is no density gradient in this water example, so the conductive heat flux goes to zero.


Now we take my argument down to Earth, literally. Why is Earth’s surface ~15°C?

Because geothermal “delivers” 0°C to the surface. Add insolation and subtract latent and sensible heat, and you get your 15°C. Simple. No Greenhouse scam necessary.
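
Here is that bookkeeping as a rough one-liner. The ~334 W/m² geothermal figure is from my articles above; the latent and sensible numbers are approximate textbook budget values, chosen just for illustration:

$ awk 'BEGIN { s=5.67e-8; net=334+161-88-17; printf "%d W/m2 -> %.1f C\n", net, (net/s)^0.25-273.15 }'
390 W/m2 -> 14.8 C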

More details in other articles, such as here, here, and here.

Enjoy 🙂 -Zoe

Happy New Year, Everybody!

Ozone Hole Watch

NASA provides extensive ozone data. I wanted to see what it looks like over time. The charts below are based on this definition:

The ozone hole area is determined from total ozone satellite measurements. It is defined to be that region of ozone values below 220 Dobson Units (DU) located south of 40°S.

Link
Daily
Annual Mean

Note: 1995 and most of 1996 are missing from their data. I drop 1996 entirely from the annual average.

Code: ozone.sh

# Zoe Phin
# 2020/12/22

require() { sudo apt-get install curl gnuplot; }

download() { for y in {1979..2020}; do
    curl -o y$y.csv "https://ozonewatch.gsfc.nasa.gov/meteorology/figures/ozone/to3areas_${y}_toms+omi+omps.txt"
done; }

daily() { awk '
    BEGINFILE { d=0 }
    FNR>6 && $2!~/9999/ {
        y=substr($1,1,4); d+=1/366
        printf "%8.3f %s\n", y+d, $2
    }' y*.csv > daily.csv
}

# Run daily() first
annual() { cut -c 1-4,9- daily.csv | awk '{
        S[$1]+=$2; N[$1]+=1 
    } END {
        for (y in S) print y" "S[y]/N[y]
    }' | sed '/^1996/d' > annual.csv
}

# Arg 1 - 'daily' or 'annual'
plot() { echo "set term png size 740,370
    set nokey 
    set title 'Ozone Hole Area (mil. km²)'
    set xrange [1979:2021]
    set mxtics 5
    set grid ytics
    plot '$1.csv' u 1:2 w l lw 1 lc rgb 'blue'
    " | gnuplot > $1.png
}

Run it:

$ source ozone.sh
$ require && download
$ daily && annual
$ plot daily
$ plot annual

Enjoy 🙂 -Zoe

US Deaths

There are some articles claiming that US deaths from all causes are lower this year than in previous years. Is it true?

No, it is not true. Here’s my result:

Year Pop.      Deaths  %
2011 311556874 2512442 0.806%
2012 313830990 2501531 0.797%
2013 315993715 2608019 0.825%
2014 318301008 2582448 0.811%
2015 320635163 2699826 0.842%
2016 322941311 2703215 0.837%
2017 324985539 2788163 0.858%
2018 326687501 2824382 0.865%
2019 328239523 2835038 0.864%
2020 331937300 3290723 0.991%

Code… uscovid.sh:

# Zoe Phin
# 2020/12/15

download() {
    wget -O covid.csv -c 'https://data.cdc.gov/api/views/xkkf-xrst/rows.csv?accessType=DOWNLOAD&bom=true&format=true%20target='
    wget -O popul.csv -c 'https://www2.census.gov/programs-surveys/popest/datasets/2010-2019/national/totals/nst-est2019-alldata.csv'
}

usdeaths() {
    echo 'Year Pop.      Deaths  %'
    (sed -n 2p popul.csv | tr ',' '\n' | sed -n 9,17p > .pop
    sed -n 2p popul.csv | tr ',' '\n' | sed -n 39,47p > .death
    for y in {2011..2019}; do echo $y; done | paste -d ' ' - .pop .death
    awk -F, 'NR<9999 && $2~/States/ && substr($1,1,4)==2020 { 
        S+=$3; "date +%j -d "$1 | getline D } END { 
        printf "2020 331002651 %d\n",366/D*S }' covid.csv
    ) | awk '{
        printf "%s %5.3f%\n", $0, $3/$2*100
    }'
}

Run it:

$ source uscovid.sh && download && usdeaths

Note 1: As of writing this, the current year has 340 days of data. What I did was multiply the death count by 366/340.

Note 2: US 2020 population estimate number came from here. It will already be obsolete by the time you read this. Expect a little error in %.

Solar Spectrum

I wrote some code to generate a solar spectrum chart from official data. Thought I’d share it with you …

ssi.sh:

# Zoe Phin
# 2020/12/20

require() {
    sudo apt-get install curl nco gnuplot
}

download() {
    curl -o ssi.nc https://www.ncei.noaa.gov/data/solar-spectral-irradiance/access/daily/ssi_v02r01_daily_s20190101_e20191231_c20200226.nc
}

yearavg() {
    ncks --trd -HC -v SSI ssi.nc | awk -F '[= ]' '
        { D[$4]+=$6 } END { for (d in D) printf "%.4f %11.9f\n", d/1000, D[d]/365*1000 }
    ' | sort -n | sed 1d | awk '
        BEGIN { pi=atan2(0,-1); c=299792458; h=6.62607015E-34; k=1.380649E-23 
            r=695700000; d=149597870700 }
        function P(T,w) { return ((2*pi*h*c^2/(w/1e6)^5)/(exp(h*c/(k*T*(w/1e6)))-1))/1e6 }
        { printf "%s %.9f\n", $0, P(5772,$1)*(r/d)^2 }
    ' > ssi.csv
}

plot() {
    echo 'set term png size 740,370
        set xrange [0:2.5]
        set mxtics 5
        set grid xtics ytics
        set ylabel "Radiance (W/m²/μm)"
        set xlabel "Wavelength (μm)"

        plot "ssi.csv" u 1:2 t "Solar Spectrum" w l lw 1 lc rgb "orange",\
             "" u 1:3 t "5772K Blackbody" w l lw 1 lc rgb "black"
    ' | gnuplot > solar.png
}

Run it:

$ source ssi.sh
$ require
$ download
$ yearavg
$ plot

or

$ . ssi.sh; require && download && yearavg && plot

Data and image contained in ssi.csv and solar.png

Enjoy 🙂 and Happy Holidays! -Zoe

Update

I also made a version for nanometer wavelengths rather than microns …

ssi-nm.sh:

# Zoe Phin
# 2020/12/22

require() {
    sudo apt-get install curl nco gnuplot
}

download() {
    curl -o ssi.nc https://www.ncei.noaa.gov/data/solar-spectral-irradiance/access/daily/ssi_v02r01_daily_s20190101_e20191231_c20200226.nc
}

yearavg() {
    ncks --trd -HC -v SSI ssi.nc | awk -F '[= ]' '
        { D[$4]+=$6 } END { for (d in D) printf "%.4f %11.9f\n", d, D[d]/365 }
    ' | sort -n | sed 1d | awk '
        BEGIN { pi=atan2(0,-1); c=299792458; h=6.62607015E-34; k=1.380649E-23 
            r=695700000; d=149597870700 }
        function P(T,w) { return ((2*pi*h*c^2/(w/1e9)^5)/(exp(h*c/(k*T*(w/1e9)))-1))/1e9 }
        { printf "%s %.9f\n", $0, P(5772,$1)*(r/d)^2 }
    ' > ssi.csv
}

plot() {
    echo 'set term png size 740,370
        set xrange [0:2500]
        set xtics 200; set mxtics 2
        set ytics 0.2; set mytics 2
        set format y "%.1f"
        set grid xtics ytics
        set ylabel "Radiance (W/m²/nm)"
        set xlabel "Wavelength (nm)"

        plot "ssi.csv" u 1:2 t "Solar Spectrum" w l lw 1 lc rgb "orange",\
             "" u 1:3 t "5772K Blackbody" w l lw 1 lc rgb "black"
    ' | gnuplot > ssi-nm.png
}

Run:

$ . ssi-nm.sh; require && download && yearavg && plot

Climate Scientists vs Air Force

MODTRAN is a tool developed by the US Air Force and Spectral Sciences, Inc. to model absorption in the atmosphere. A free version is available from the University of Chicago here.


Let’s start playing with this tool. We set all atmospheric gases and other parameters to zero.

CO2 = 0 ppm

The ground temperature doesn’t change – contra the opinion of mainstream climate “scientists”. Now we set carbon dioxide to 99.9999%:

CO2 = 999999 ppm

The ground temperature doesn’t change – contra the opinion of mainstream climate “scientists”.


We look at temperature height profiles for both minimum and maximum carbon dioxide concentration:

CO2 = 0 ppm
CO2 = 999999 ppm

The temperature at various heights doesn’t change – contra the opinion of mainstream climate “scientists”.

Now we look at absorption with various levels of CO2:


CO2 = 0 ppm
CO2 = 999999 ppm
CO2 = 410 ppm
CO2 = 820 ppm

While absorption obviously changes, surface (boundary) temperature doesn’t change – contra the opinion of mainstream climate “scientists”.

Now we add 100 degrees to the surface temperature:

Ground Temperature = 388 K, CO2 = 410 ppm
Ground Temperature = 388 K, CO2 = 410 ppm

The absorption factor and transmittance remains the same (~255.7 CM-1, 0.8836).

Absorption factor doesn’t change based on temperature, and we’ve already seen absorption doesn’t change temperature – contra the opinion of mainstream climate “scientists”.

Play with UChicago MODTRAN yourself, and see that I’m correct.

Enjoy 🙂 -Zoe

Scraping 2020 US Election Data

I thought this code could be useful for researchers. I wrote it a while ago, then never used it. Too busy 😦

elec.sh:

# Zoe Phin
# 2020/11/11

require() {
   sudo apt-get install curl jq gnuplot
}

download() {
   curl -O 'https://static01.nyt.com/elections-assets/2020/data/api/2020-11-03/state-page/{alabama,alaska,arizona,arkansas,california,colorado,connecticut,delaware,florida,georgia,hawaii,idaho,illinois,indiana,iowa,kansas,kentucky,louisiana,maine,maryland,massachusetts,michigan,minnesota,mississippi,missouri,montana,nebraska,nevada,new-hampshire,new-jersey,new-mexico,new-york,north-carolina,north-dakota,ohio,oklahoma,oregon,pennsylvania,rhode-island,south-carolina,south-dakota,tennessee,texas,utah,vermont,virginia,washington,west-virginia,wisconsin,wyoming}.json'
}

timelines() { # convert each state's cumulative counts into per-interval increments
   for state in `ls *.json | sed s/.json//`; do
      jq -Mc '.data.races[0].timeseries[]|[.timestamp,.votes,.votes*.vote_shares.trumpd,.votes*.vote_shares.bidenj]' $state.json | tr -d '["]' |\
      awk -F, '{"TZ=US/Eastern date +%d%H%M%S -d "$1 | getline t; printf "%s %8d %8d %8d\n",t,$2-vlast,$3-tlast,$4-blast; vlast=$2; tlast=$3; blast=$4}' > $state.csv
   done
}

merged() { # merge all states chronologically and re-accumulate national totals
   sort -n *.csv | awk '{vtotal+=$2; ttotal+=$3; btotal+=$4; printf "%s %9d %9d %9d\n",$1,vtotal,ttotal,btotal}' > elec.dat
}

plot() {
   echo 'set term png size 740,740
   set key top left 
   set grid xtics ytics
   set ylabel "Million Votes"
   set timefmt "%d%H%M%S"
   set xdata time
   set xtics format "11/%d\n%H:%M"
   set ytics format "%.0f"
   set xrange ["03190000":"04040000"]
   plot "elec.dat" u 1:($3/1e6) t "Trump" w lines lc rgb "red" lw 1,\
                "" u 1:($4/1e6) t "Biden" w lines lc rgb "blue" lw 1
   ' | gnuplot > elec.png
}

Run it:

$ source elec.sh
$ require && download
$ timelines && merged && plot

Result: elec.png

Enjoy 🙂 -Zoe

Update 2020/12/06

County level data can be obtained with this code snippet:

counties() {
   echo " County              |    Votes |    Votes    Votes    Votes |      %          %         %   "
   echo " Name                |    Total |    Biden    Trump    Jorg. |    Biden      Trump     Jorg."
   for state in `ls *.json | sed s/.json//`; do
      echo -e "\n--- $state ---\n"
      jq -Mc '.data.races[0].counties[]|[.name,.votes,.results.bidenj,.results.trumpd,.results.jorgensenj]' $state.json | tr -d '["]' |\
      awk -F, -vs=$state '$2!=0{ v+=$2; b+=$3; t+=$4; j+=$5;
         printf "%-20s | %8s | %8s %8s %8s | %9.3f %9.3f %9.3f\n",$1,$2,$3,$4,$5,$3/$2*100,$4/$2*100,$5/$2*100} END {
         printf "\nTotal                | %8s | %8s %8s %8s | %9.3f %9.3f %9.3f\n",v,b,t,j,b/v*100,t/v*100,j/v*100
      }'
   done
}

Run it:

$ . elec.sh; counties > counties.txt

The contents of counties.txt are archived here.

Geothermal to the Moon!

The Earth and the Moon have been neighbors for a very long time. We should expect the long-term, steady-state heat transfer from Earth’s internal energy (without the Sun) to the Moon to equal Earth’s geothermal heat flux. To my knowledge, no one has ever made this claim. Sound outrageous? Let’s see…

According to [Davies 2010], the geothermal flux is 46.7 TW:

We conclude by discussing our preferred estimate of 47 TW, (rounded from 46.7 TW given that our error estimate is ± 2 TW)

— [Davies 2010]

When we divide this figure by the surface area of the Earth, we get:

(46700000000000 ± 2000000000000)/510065728777855 = 0.09155683 ± 0.0039 W/m²

This is not an emergent radiative flux, but a conductive heat flux.

I will now argue that this number is not accidental, but is based on heat transfer from the Earth to the Moon.

In my article Measuring Geothermal …, I extracted 335.643 W/m² as an emergent flux equivalent from geothermal.

In my article Deducing Geothermal, I deduced 332.46 W/m² as an emergent flux equivalent from geothermal.

Let’s average those two values to get: 334.05 W/m²

Now we apply the inverse square law for distant radiation, using data from NASA:

334.05 W/m² × (6371 km/ 378000 km)² = 0.09489495 W/m²
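
Both numbers in one paste-able line:

$ awk 'BEGIN { printf "measured: %.4f W/m2\nEarth->Moon: %.4f W/m2\n", 46.7e12/510065728777855, 334.05*(6371/378000)^2 }'
measured: 0.0916 W/m2
Earth->Moon: 0.0949 W/m2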

This value falls within the error range of measured geothermal flux. Do you think this is a coincidence? I think not. It makes perfect sense.

Enjoy 🙂 -Zoe

The Steel Greenhouse Ruse

Amateur scientist Willis Eschenbach developed a thought experiment to demonstrate how the greenhouse effect “works”:

It’s been refuted many times before, but I’ll make it even simpler.

The main claim is that the outer shell’s presence will force the inner core to warm up and radiate twice as much compared to no shell at all.

We start with 235 W/m² emerging from core and going to shell. I’ll use an inner and outer surface area of 1, and consider what is going on every second to make things simple.

235 Joules emerges from the core in the 1st second and goes to the shell. Willis reminds us:

In order to maintain its thermal equilibrium, the whole system must still radiate 235 W/m² out to space.

However … the first thing Willis does is break this rule, and sends 235 J back to the core. Nothing to space.

Now a new 235 J emerges from the core in the 2nd second, which gets added to the 235 J coming back from the shell from the 1st second.

The core now sends 470 J to the shell. This 470 J now gets split into 235 J back to core and 235 J to space.

Second 3 and onward just repeats. So you see what he did there? By violating a rule, he gets to cycle in an extra 235 J every second.

There’s a more accurate variation where the rule is violated several times, but with fewer offending Joules each cycle. It goes like this:

Second 1: 235 core->shell, 117.5 shell->core, 117.5 -> space
Second 2: 352.5 core->shell, 176.25 shell->core, 176.25 -> space
Second 3: 411.25 ... 205.625 ... 205.625
Second 4: 440.625 ... 220.3125 ... 220.3125
Second 5: 455.3125 ... 227.65625 ... 227.65625
...
Second X: 470 ... 235 ... 235
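
Here is a minimal awk sketch of that second variation (my own re-derivation of the series above, not the linked program):

awk 'BEGIN {
    for (s=1; s<=10; s++) {
        up   = 235 + back   # core emits its constant 235 J plus whatever the shell sent back
        back = up / 2       # shell splits what it receives: half back down, half to space
        printf "Second %2d: %.4f core->shell, %.4f shell->core, %.4f -> space\n", s, up, back, up-back
    }
}'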

I think you get the idea. I wrote a program to do all this, here. The first variation is easier to describe. Here’s some fun satire to illustrate the main point:


Imagine you’re the head manager of a sugar factory.

Every minute, a bag filled with 235 grams of sugar slides down a chute and lands in a basket. You take this bag and walk across the factory to place it inside a truck for later delivery.

You’ve been trying to figure out how to cheaply increase your production for a while now, and one day you finally got a great idea …

You decide to place a table at a halfway point between basket and truck.

In the first minute of implementing your great idea, you move the bag from basket to table. You decide not to then carry the bag to the truck, but back to the basket. You drop the bag in the basket a second before a new bag comes down the chute. When that new bag drops in the basket, and you see two bags, you say to yourself: “I’m a genius! I just doubled production!”.

You now carry two bags to the table. Then one bag to the truck, and one bag back to the basket. You then repeat this over and over.

You convince yourself that seeing two bags in the basket and carrying them to the table means that you’ve doubled production. The proof is self-evident. Congratulations!

Unfortunately not everyone agreed with you. Many thought you were crazy. So you fired them and hired those that agreed with you. You wanted consensus, and you got it!


Now I’m going to illustrate the greenhouse gas fallacy in the most primitive way, using only 2 water molecules:

Core
Shell

We’re at second 0, before any greenhouse magic begins, so the shell is still at 0 J, but the nuclear core is at 235 J. The intensity of motion represents the amount of energy present.

Energy is in fact motion. The universe has only two things: things and motion of them. I’m excluding space.

Willis (and all greenhouse gas junkies in general) believe that energy is just like matter, and you can pass it back to where it came from to have more of it.

What Willis et al end up doing is adding motion to existing motion to intensify motion. They believe this is science, but it’s actually a false philosophy.

Philosophy – core vibrates at twice the intensity

Core
Shell

In actual science, we know that the max energy into a system is the max energy THROUGHOUT the system. But in Willis’ philosophy, you can create a feedback loop that causes more energy (motion) somewhere in the system, but it’s all fine as long as just the final output (to space) obeys conservation of energy in regard to original input. This is completely false. Conservation of energy must be followed at every boundary.

Science – shell achieves vibrational resonance with core

Core
Shell

In reality, the shell will just come to resonate with the core. There will never be a molecule that vibrates more intensely than what the original energy supplied into the system allows.

This is all just 220-year-old basic science. Hopefully, climate scientists might learn some basic experimental thermodynamics rather than relying on a falsified thought experiment.

Summary: You can’t make something vibrate more vigorously by confining it with another thing vibrating at an equal or lower rate.

Enjoy 🙂 -Zoe

Fourier’s Accidental Confession

Fourier is considered a direct predecessor to mainstream climatology. Mainstream climatology follows him and purposefully neglects geothermal energy in Earth’s energy budget due to the belief that it is too small. This then allows them to make the outrageous claim that it is IR-absorbing gases in the atmosphere that boosts surface temperatures to what we measure with thermometers.

So is it true that geothermal is negligible?

According to Fourier’s translated 1827 paper:

The effect of the primitive heat which the globe has retained has therefore
become essentially imperceptible at the Earth’s surface …

the effect of the interior heat is no longer perceptible at the surface of the Earth

– Temperatures of the Terrestrial Sphere, Page 15

Well that looks settled. Doesn’t it? Let’s see the whole context:

Temperatures of the Terrestrial Sphere, Page 15

This is a very curious paragraph, for it admits too much.

The only way to melt ice is to provide at least 0°C worth of energy. Right?

0°C is not “negligible”, now is it?

I can already hear my critics saying: “But Zoe, he said over a century!”

Sure. It’s so marginally over 0°C, that it takes a century to melt 3 cubic meters of ice. So what? It’s still at least 0°C. And it’s coming from the Earth.

Fourier contradicts himself when he claims Earth’s internal heat is imperceptible. Is ice melting not perceptible? What if he chose dry ice? More perceptible. What about nitrogen or oxygen “ice”? Even more perceptible!

Is 0°C correct? What do modern geophysicists think?

https://www.routledgehandbooks.com/doi/10.1201/9781315371436-4

Same thing! 0°C is still the convention.

The radiative equivalent of 0°C at emissivity=1 is 315.6 W/m²

Can this really be excluded from the energy budget? No.

What’s the significance of this?

It means the greenhouse effect is junk science. The surface has enough energy from geothermal and solar to explain surface temperatures.

I have two previous articles describing how the geothermal contribution can be computed more accurately using two different methods:

https://phzoe.com/2020/02/13/measuring-geothermal-a-revolutionary-hypothesis/

https://phzoe.com/2020/02/25/deducing-geothermal/

It’s nice to know that the geothermal hypothesis was accidentally given scientific support by the very guy who unfortunately rejected it. A guy whom modern academics follow uncritically. The answer was right beneath his feet, but unfortunately his head was in the clouds. Because of him, modern academics truly believe that it is the atmosphere that provides raw energy to the surface, rather than geothermal. What a colossal mistake. They flipped reality completely upside down.

While my critics like to claim that geothermal can only provide ~36 Kelvin because they applied Stefan-Boltzmann formula to the small conductive heat flux of 91.6 mW/m², actual scientists know that geothermal can melt ice. And this knowledge is 200 years old! When are climate scientists going to wake up?

-Zoe

Update 10/02/2020

My critics point out that Fourier meant that his 318 mW/m² accumulates over the course of a century to melt that ice; 3 centuries at today’s known geothermal heat flux of 91 mW/m².

That’s not the point. The point was to expose Fourier’s own confusion over the difference between heat and energy. Fourier’s conduction formula applies to HEAT flow, not energy. 318 mW/m² or 91 mW/m² of total emissive energy will NEVER melt ice. But 318 or 91 mW/m² of HEAT flow might, depending on the temperature the ice is sitting on.

Bottom line: Did Fourier claim geothermal could melt ice? YES. Did he give a good explanation? NO.

Is Fourier a good choice to be a father of climate science? That’s a big NO.

But … since Fourier claimed geothermal could melt ice, I will take his word for it, because in this case he is absolutely right.

COVID19 in Georgia

Today I analyze COVID19 data for my home state of Georgia. I thought it would be interesting because there is an anomaly. Let’s see the anomaly:

Cases per 100K (Source)
Population Density

You see it? The largest density of cases does not match the largest density of population. We would expect most cases per 100K to be in the 9th largest metropolis in the US (Atlanta), but it’s not!

How could this be? What could cause such an anomaly?

Might it have something to do with foreign labor? Georgia was the 2nd largest recipient of temporary agricultural H-2A visas in 2019 (Source). Trend:

There’s no data as to which counties migrant workers go to, but we can take a logical leap: The most agriculturally productive counties probably have the most migrant workers.

We would expect those counties with the largest share of agriculture to be those disproportionately affected by COVID19. Let’s see …

Corn, 2019
Cotton, 2019
Peanuts, 2019
Corn, 2018
Cotton, 2018
Peanuts, 2018

It’s not a perfect match, but I think there’s something to it. Maybe I am wrong, but I haven’t found a better explanation from my local media. In fact, the issue was not even addressed by anyone.

Other states also have low density counties with high COVID19 densities, but they seldom surpass the rates in their major metro areas. Georgia is anomalous in this regard.

Thoughts? Comments?

Peace, -Zoe

CO2 Versus Global COVID19 Response

With the global economic response to the COVID19 epidemic, we would expect global CO2 to be rising much less than in other years, if the theory of man-made global warming is indeed true.

I use data from NOAA to see what’s going on.

The estimated daily global seasonal cycle and trend value for CO2 are determined from the daily averaged CO2 data from the four NOAA/ESRL/GMD Baseline observatories. A smoothed seasonal cycle and a smoothed de-seasonalized trend curve are determined for each observatory record at daily intervals. An estimated global seasonal cycle and trend are computed by averaging the four individual observatory seasonal cycle and trend curves at each daily interval.

— ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_trend_gl.txt

I chose the most official processed data there is, so I can’t be accused of cherrypicking. What I do is compare May 1st to Jan 1st of every year from 2010 to 2020. Results:

2010 1.96 0.90
2011 0.95 0.58
2012 1.50 0.68
2013 2.25 0.98
2014 1.54 0.59
2015 1.87 0.72
2016 2.34 1.20
2017 1.64 0.67
2018 1.93 0.78
2019 1.88 0.86
2020 2.07 0.96

Results are in increased ppm (parts per million). 2nd column is smoothed seasonal cycle. 3rd column is smoothed de-seasonalized trend curve.
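
To rank the years, a quick check (assuming the table above is saved as results.txt):

$ sort -k2 -nr results.txt | head -3
2016 2.34 1.20
2013 2.25 0.98
2020 2.07 0.96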

As you can see, 2020 was the 3rd largest increasing year, after 2016 and 2013.

We would expect it to come in last. Looks like nature doesn’t respond that quickly … or at all.

Peace, -Zoe

Update 06/05/2020

# For Jan 1 to Jun 4

2010 1.55 1.16
2011 0.56 0.72
2012 0.95 0.90
2013 1.99 1.23
2014 1.43 0.76
2015 1.31 0.95
2016 1.89 1.53
2017 1.33 0.84
2018 1.71 1.00
2019 1.58 1.12
2020 1.53 1.18

Code

co2.sh:

wget -qO co2.txt -c ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_trend_gl.txt      
awk '!/#/ && $2==1 && $3==1 { print $1" "$4" "$5 }' co2.txt > .start       
# Change $2 and $3 to Month and Day: (Ex: $2==5 && $3==1 for May 1st )
awk '!/#/ && $2==5 && $3==1 { print $4" "$5 }' co2.txt > .end 
paste .start .end | awk '{printf "%s %.2f %.2f\n", $1, $4-$2, $5-$3}'
rm -f .start .end

Run it:

$ bash co2.sh

The Irrelevance of Geothermal Heat Flux

You’ve probably heard it before: the geothermal heat flux is so small (91.6 mW/m²) that it can be effectively ignored in Earth’s energy budget. The first part is true, the heat flux is small, but this fact is completely irrelevant. And what is relevant is popularly denied and masked as something else.

I’ve already explained the problem here and here. Unfortunately not everyone understood the point I was trying to make, so I made a visualization:

Various Profiles with the same Geothermal Heat Flux (CF). Emissivity=1

CF (Conductive Flux) is the Geothermal Heat Flux, EF is the Emergent Geothermal Flux, Th and Tc are the temperatures of the hot side and cold side. d is depth. Compatibility with my previous terminology: CF = CHF and EF = CSR.

As you can see all of these profiles have the same geothermal heat flux (CF), and all of them produce a very different emergent flux (EF) out of the surface. The popularly stated geothermal heat flux is NOT a value that you can compare to insolation. The value itself gives you NO clue as to what can emerge at the top. Anyone telling you otherwise is stupid or lying.

The geothermal heat flux and the thermal conductivity factor determines the temperature gradient. A gradient can never tell you either what kinetic energy is at the bottom or the top. Never.

So what really emerges at the top on Earth? In this visualization, the closest answer is ~5°C or ~340 W/m² – what was calculated and observed here and here. ~340 W/m² is what is claimed for the total greenhouse gas backradiation effect, as shown in the “official” energy budget here. That’s not surprising, because the greenhouse gas effect is secretly just geothermal flipped upside down. It’s the biggest scam in climate science, and you heard it here first.

Geothermal provides a tremendous amount of energy, even more than the sun, but climate scientists ignore it because they are looking at a component of a gradient/slope measure, rather than the temperature (kinetic energy) it delivers to the surface.

I invite everyone to give this some serious thought and not just dismiss it using sophistry.

Love, -Zoe

Extra

  1. Geothermal Heat Flux (CF) is a very useful value for commercial geothermal energy prospectors, but not for atmospheric scientists creating an energy budget. EF is what they need to use. They do use it, but they flip it upside down and call it GHE.
  2. The temperature gradient value used is 27.5 °C/km, which I got from here: “it is about 25–30 °C/km”. With CF = 0.0916 W/m², this makes k = 3.33 W/(m*K) (quick check below).
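
A quick check of that constant from q = k*ΔT/L:

$ awk 'BEGIN { printf "k = %.2f W/(m*K)\n", 0.0916/(27.5/1000) }'
k = 3.33 W/(m*K)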

The Strange Case of Mimas

Mimas is a small moon of Saturn. It is most famous for its striking resemblance to the Death Star from the popular movie Star Wars.

Mimas

But from this day it will be famous for refuting mainstream climate science.

How, you ask?

Well … let’s examine its external energy sources:

1) Insolation. The insolation at Mimas should be approximately the same as that for Saturn.

2) Saturn. Radiation received from Saturn should equal the emission from Saturn, diluted by the square of (the radius of Saturn divided by the distance from Saturn to Mimas), and divided by 4.

NASA’s Facts Sheets (Saturn, Saturn Satellites) provides us all the numbers we need.

Apply standard formulas:

1) 14.82 * (1 - 0.6) / 4 = 1.482 W/m²

2) (5.67e-8)*(81)^4 * (54364/185539)^2 / 4 = 0.0524 W/m²

The total is 1.5344 W/m²

Let’s convert that back to a temperature (assuming emissivity = 1, by [Howett 2003]):

(1.5344/5.67e-8)^0.25 = 72.1
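
The whole budget in one awk block, if you want to check the arithmetic:

awk 'BEGIN { s = 5.67e-8
    sun = 14.82 * (1-0.6) / 4               # absorbed insolation
    sat = s * 81^4 * (54364/185539)^2 / 4   # radiation received from Saturn
    printf "total %.4f W/m2 -> T = %.1f K\n", sun+sat, ((sun+sat)/s)^0.25
}'
total 1.5344 W/m2 -> T = 72.1 K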

According to mainstream climate science, only special gases in the atmosphere can boost the surface temperature beyond what external radiation (the sun) alone can do. On Earth, they claim these gases boost the surface temperature by ~33 K.

Mimas has no greenhouse gases or even an atmosphere, so its average temperature should never exceed 72.1 K.

But in reality …

https://www.nasa.gov/mission_pages/cassini/multimedia/pia12867.html

It looks like PacMan is powering the Death Star, and the surface temperature is boosted by 2 to 24 K beyond what external radiation alone can do. Nothing there is below 74 K!

Isn’t it obvious that Mimas is geothermally boosted?

Neither the greenhouse effect theory of mainstream climate science nor the atmospheric pressure theory of Nikolov & Zeller, et al can explain this!

Nothing else can explain PacMan and the thermal boost other than geothermal.

And if a tiny planetoid like Mimas has its own oddly distributed internal energy, maybe the Earth, which is 158,730 times more massive, could as well?

Think!

Love, -Zoe

Lunar Warming

In a previous article, I examined the average moon temperature (AMT). You may have noticed that there’s been ~3 K of warming in the last decade.

According to [Vasavada 2012], the mean equatorial temperature between 2009 and 2011 was about 213K, whereas the 2017-2018 data from UCLA and WUSTL shows that to be about 216K.

For AMT, the increase has been from ~197K to ~200K.

Perhaps there is some error in the exactness, but that the moon has warmed is not actually controversial; it is accepted by mainstream scientists. I wanted to share with you today their theory as to the cause. Are you ready?

Google “lunar warming”. Here is what you will get:

Mainstream Nonsense

Livescience reports:

According to the new study, the 12 Apollo astronauts who walked on the moon between 1969 and 1972 kicked aside so much dust that they revealed huge regions of darker, more heat-absorbing soil that may not have seen the light of day in billions of years. Over just six years, this newly exposed soil absorbed enough solar radiation to raise the temperature of the entire moon’s surface by up to 3.6 degrees F (2 degrees C), the study found.

Livescience

You got that? They didn’t just raise the temperature where they walked but the ENTIRE moon!

You buy it? I hope not. Great laugh, right?

What is the ratio of surface area walked to the entire moon? I don’t know, but it’s ultra tiny. Seems like heat capacity calculations were ignored. The walked surface area might have to be millions (if not billions) of degrees to raise the entire surface area of the moon by a single degree – ASSUMING there’s horizontal heat transfer via conduction.

Now why would they say something that absurd?

I’ll tell you. Scientists have known that Total Solar Irradiance has been decreasing since the 1950s, and the moon has virtually no atmosphere. Because there is no atmosphere there can’t be any stupid greenhouse effect at work.

That would leave geothermal (lunathermal, I guess) warming as the only culprit!

And if the surface of the moon can warm up due to more internal energy coming up from beneath the surface, perhaps the same thing can be at work on Earth …

Think about it, Occam’s Razor sharp … (answer)

-Zoe

Do blankets warm you?

Believers of the Greenhouse Effect all use the same analogy to get you to believe in their junk science. The site Skeptical Science sets the standard in this article:

So have climate scientists made an elementary mistake? Of course not! The skeptic is ignoring the fact that the Earth is being warmed by the sun, which makes all the difference.

To see why, consider that blanket that keeps you warm. If your skin feels cold, wrapping yourself in a blanket can make you warmer. Why? Because your body is generating heat, and that heat is escaping from your body into the environment. When you wrap yourself in a blanket, the loss of heat is reduced, some is retained at the surface of your body, and you warm up. You get warmer because the heat that your body is generating cannot escape as fast as before.

Link

And more:

To summarise: Heat from the sun warms the Earth, as heat from your body keeps you warm. The Earth loses heat to space, and your body loses heat to the environment. Greenhouse gases slow down the rate of heat-loss from the surface of the Earth, like a blanket that slows down the rate at which your body loses heat. The result is the same in both cases, the surface of the Earth, or of your body, gets warmer.

Link

NASA reminds us that:

The greenhouse effect is the way in which heat is trapped close to the surface of the Earth by “greenhouse gases.” These heat-trapping gases can be thought of as a blanket wrapped around the Earth, which keeps it toastier than it would be without them.

Link

You got that? Blankets warm you! Their logic is so sound that they couldn’t possibly be wrong, could they?

What empirical evidence do they provide for such an assertion? None!

Do they even attempt to predict what temperature a blanket could force? No!

Any such attempt would be very embarrassing for them, so instead they just leave it to the reader’s imagination.

First a note: there is no doubt that a blanket can make you warmer by blocking convection. The issue at hand is whether there is a warming due to radiative heat transfer, as is claimed for the greenhouse effect by analogy.

Let’s consider the case of a typical cotton blanket, whose emissivity ranges from 0.81 to 0.88 [Bellivieu 2019], depending on humidity. I will choose 0.85 for an average humidity condition; the exactness hardly matters. According to the verified program provided in my article The Dumbest Math Theory Ever, a blanket with an emissivity of 0.85 placed on a human being whose normal temperature is 37°C should produce a final skin temperature of …

$ ALB=0 TSI=2090.8 bash gheffect 0.85

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 522.700 W |  36.701 C | 444.295 W | 222.148 W | 300.553 W
  2 | 744.848 W |  65.389 C | 410.973 W |  94.413 W | 428.287 W
  3 | 839.260 W |  75.642 C | 396.811 W |  40.125 W | 482.575 W
  4 | 879.386 W |  79.738 C | 390.792 W |  17.053 W | 505.647 W
  5 | 896.439 W |  81.436 C | 388.234 W |   7.248 W | 515.452 W
  6 | 903.687 W |  82.151 C | 387.147 W |   3.080 W | 519.620 W
  7 | 906.767 W |  82.453 C | 386.685 W |   1.309 W | 521.391 W
  8 | 908.076 W |  82.582 C | 386.489 W |   0.556 W | 522.144 W
  9 | 908.632 W |  82.636 C | 386.405 W |   0.236 W | 522.464 W
 10 | 908.869 W |  82.659 C | 386.370 W |   0.100 W | 522.600 W
 11 | 908.969 W |  82.669 C | 386.355 W |   0.043 W | 522.657 W
 12 | 909.012 W |  82.673 C | 386.348 W |   0.018 W | 522.682 W
 13 | 909.030 W |  82.675 C | 386.345 W |   0.008 W | 522.692 W
 14 | 909.038 W |  82.676 C | 386.344 W |   0.003 W | 522.697 W
 15 | 909.041 W |  82.676 C | 386.344 W |   0.001 W | 522.699 W
 16 | 909.042 W |  82.676 C | 386.344 W |   0.001 W | 522.699 W
 17 | 909.043 W |  82.676 C | 386.344 W |   0.000 W | 522.700 W

82.6°C! Really hot!
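
The iteration converges to a closed form, T = ((TSI*(1-ALB)/4) / (σ*(1-ε/2)))^0.25. Here is a one-liner check (my reconstruction from the table above, not the original gheffect source):

$ awk 'BEGIN { s=5.67e-8; TSI=2090.8; ALB=0; e=0.85; printf "%.1f C\n", (TSI*(1-ALB)/4/(s*(1-e/2)))^0.25-273.15 }'
82.7 C

Swapping in ALB=0.22 or an emissivity of 0.5 reproduces the other two limits below to within ~0.1°C.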

Note that I set the albedo to zero. This is because I figure any scattering of photons between human and blanket will find its path back to the human (and thus “should” cause warming), with very little leakage at the edges of the blanket. But let us be as generous as possible to climate alarmists and say the blanket has an albedo of 0.22 (The highest value found for cotton in scientific literature: Source 1, Source 2). What then?

$ ALB=0.22 TSI=2090.8 bash gheffect 0.85

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 407.706 W |  18.040 C | 346.550 W | 173.275 W | 234.431 W
  2 | 580.981 W |  44.999 C | 320.559 W |  73.642 W | 334.064 W
  3 | 654.623 W |  54.635 C | 309.513 W |  31.298 W | 376.408 W
  4 | 685.921 W |  58.484 C | 304.818 W |  13.302 W | 394.404 W
  5 | 699.222 W |  60.081 C | 302.823 W |   5.653 W | 402.053 W
  6 | 704.875 W |  60.752 C | 301.975 W |   2.403 W | 405.303 W
  7 | 707.278 W |  61.036 C | 301.614 W |   1.021 W | 406.685 W
  8 | 708.299 W |  61.157 C | 301.461 W |   0.434 W | 407.272 W
  9 | 708.733 W |  61.208 C | 301.396 W |   0.184 W | 407.522 W
 10 | 708.918 W |  61.230 C | 301.368 W |   0.078 W | 407.628 W
 11 | 708.996 W |  61.239 C | 301.357 W |   0.033 W | 407.673 W
 12 | 709.029 W |  61.243 C | 301.352 W |   0.014 W | 407.692 W
 13 | 709.043 W |  61.245 C | 301.349 W |   0.006 W | 407.700 W
 14 | 709.049 W |  61.245 C | 301.349 W |   0.003 W | 407.703 W
 15 | 709.052 W |  61.246 C | 301.348 W |   0.001 W | 407.705 W
 16 | 709.053 W |  61.246 C | 301.348 W |   0.000 W | 407.706 W
 17 | 709.054 W |  61.246 C | 301.348 W |   0.000 W | 407.706 W

61.2°C! Still very hot.

OK, I’m now going to be extremely generous, and use an emissivity value of 0.5, which is not even scientifically justifiable, but let’s give the alarmists a huge advantage. What then?

$ ALB=0.22 TSI=2090.8 bash gheffect 0.5

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 407.706 W |  18.040 C | 203.853 W | 101.927 W | 305.780 W
  2 | 509.633 W |  34.746 C | 152.890 W |  25.482 W | 382.224 W
  3 | 535.114 W |  38.525 C | 140.149 W |   6.370 W | 401.336 W
  4 | 541.485 W |  39.448 C | 136.964 W |   1.593 W | 406.113 W
  5 | 543.077 W |  39.678 C | 136.167 W |   0.398 W | 407.308 W
  6 | 543.475 W |  39.735 C | 135.968 W |   0.100 W | 407.606 W
  7 | 543.575 W |  39.750 C | 135.919 W |   0.025 W | 407.681 W
  8 | 543.600 W |  39.753 C | 135.906 W |   0.006 W | 407.700 W
  9 | 543.606 W |  39.754 C | 135.903 W |   0.002 W | 407.704 W
 10 | 543.607 W |  39.754 C | 135.902 W |   0.000 W | 407.706 W
 11 | 543.608 W |  39.754 C | 135.902 W |   0.000 W | 407.706 W

Now we get only 39.8°C, for a total warm up of 2.8°C – by a blanket that can only be heated by the human, and starts off colder (or same) as the human.

So is there any evidence to support the heating of human skin by a passively heated blanket via backradiation?

However, if a cotton blanket heated to 90°C is in contact with skin the patient does not experience the same tissue injuries, because the blanket has less than one third the specific heat of skin. In addition, the blanket has less than 1/1000 the density of skin (the density of a blanket is about 1 kg/m³ because it is roughly half cotton and half air.) The blanket can give up all of its heat to the skin yet raise the temperature no more than 1/80th of the 70°C temperature difference, or about 1°C.

[House 2011]

This scientist rightfully does not acknowledge warming by radiative effect. The blanket must be theoretically warmed to 90°C to achieve a rise of about 1°C. A table of empirical results is also provided in [House 2011]:

Body Part    Unheated Blankets   Blankets Warmed to 43.3°C   Blankets Warmed to 65.6°C
Abdomen      0.17°C              1.11°C                      2.39°C
Lower Legs   0.33°C              0.89°C                      1.11°C
[House 2011], Table 2, converted to Celsius

Though there is obviously a tiny amount of warming due to blocking convection, we don’t see any warming as predicted by GH effect radiative heat transfer theory. We should’ve seen a very generous 2.8°C warming as predicted by such a theory in the column Unheated Blankets. We don’t even see such a high number with blankets externally heated to 65.6°C !

Now we move onto [Kabbara 2002]. In this paper we see how expensive equipment can be used to maintain a patient’s temperature. Figure 6 shows how externally heated air prevents a patient’s temperature from falling. But one may ask: What is the purpose of this expensive equipment when climate “scientists” already know that a non-externally heated blanket should raise skin temperature by at least the very generous 2.8°C?

Would you trust these climate “scientists” with your health? Do you think they really believe what they claim?

And now we move onto: US Patent – US6078026A

The blanket A has a maximum power draw of 6.5 amps. With fully charged batteries, the blanket will reach its target temperature (i.e. 100 degrees Fahrenheit or 38 degrees Celsius) approximately 5 minutes and will remain heated for five to eight hours.

Patent US6078026A

An external power source to raise T to 38°C?

Why need external power or even a patent when a simple blanket ought to do the trick?

Please do not object to this article because I based this on a normal temperature of 37°C. Even a hypothermic temperature of 33°C should be raised by 2.72°C, IF the GH effect blanket analogy held any merit.

A search on google scholar for “hospital blankets temperature” should convince anyone with integrity that blankets don’t raise your skin temperature in accordance to radiative transfer theory. For if they did, most of the discussion and science in that search would be moot: human-only heated blankets would solve the problems and special technology would not be necessary.

Skeptical Science finishes off their article:

So global warming does not violate the second law of thermodynamics. And if someone tells you otherwise, just remember that you’re a warm human being, and certainly nobody’s dummy.

Link

I’ll translate that for you: If you believe their sophistry, you are a dummy!

With poetic license it is alright to say that blankets warm you, but as actual science, it is not correct. The best a blanket can do is keep you warm, but never make you warmer.

Enjoy 🙂 -Zoe

Addendum

Blanket(s) can suppress your perspiration and make you sick from your own urea, thus causing your temperature to go up. However, this could never be a proper analogy for the greenhouse effect.

Geothermal Animated

Geothermal Emission @ the Surface

This was derived from NCEP Reanalysis data, in the tradition of Measuring Geothermal …

Enjoy 🙂 -Zoe

Addendum

geochg.sh:

# source geochg.sh
# Zoe Phin 2020/03/13
    
F=(0 ulwrf dswrf uswrf lhtfl shtfl)                                                  
O=(0 3201.5 3086.5 3131.5 856.5 2176.5)

require() { sudo apt install nco gnuplot imagemagick; } # Linux Only
    
download() {
    b="ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/gaussian_grid"
    for i in ${F[*]}; do wget -O $i.nc -c $b/$i.sfc.mon.mean.nc; done
}

extract() { # per grid cell: net flux = ulwrf - (dswrf - uswrf) + lhtfl + shtfl
    for t in {000..491}; do echo "$t" >&2
        for i in {1..5}; do 
            ncks --trd -HC ${F[$i]}.nc -v ${F[$i]} -d time,$t | sed \$d | awk -F[=\ ] -vO=${O[$i]} '{ 
                printf "%7s %7s %7.3f\n", $4, $6, $8/10+O }' > .f$i
        done
        paste .f1 .f2 .f3 .f4 .f5 | awk '{ # geothermal = ULW - (DSW - USW) + LH + SH
            printf "%s %s %7.3f\n", $1, $2, $3-($6-$9)+$12+$15 }' > .geo$t
    done
}

annualize() {
    for y in {0..40}; do 
        args=`for m in {0..11}; do printf ".geo%03d " $((12*y+m)); done`
        paste $args | awk '{ a=0; for (i=3;i<=NF;i+=3) a+=$i; print $1" "$2" "a/12 }' > .y$((1979+y))
    done
}

colorize() {
    range=(`sort -nk 3.1 .y* | awk 'NR==1{min=$3} END { print min" "$3 }'`)
    echo ${range[*]}
    for y in {1979..2019}; do awk -vmin=${range[0]} -vmax=${range[1]} 'BEGIN { dlt=max-min }
        {   if ($2 < 191) {$2+=169} else {$2-=191}   # recenter longitudes for plotting
            printf "%s %s %4d\n", $1, $2, 1023*($3-min)/dlt }' .y$y | awk 'BEGIN { n=0
            # 1024-step palette: magenta -> blue -> green -> yellow -> red
            for (i=255; i>=0; i--) { pal[n] = sprintf("%d 0 255", i); n++ }
            for (i=0; i<=255; i++) { pal[n] = sprintf("0 %d %d", i, 255-i); n++ }
            for (i=0; i<=255; i++) { pal[n] = sprintf("%d 255 0", i); n++ }
            for (i=255; i>=0; i--) { pal[n] = sprintf("255 %d 0", i); n++ }
        } { 
            printf "%s %s %s\n", $2, $1, pal[$3] }
        ' > .c$y
    done
}

scale() {
    rm -f .scale
    range=(`sort -nk 3.1 .y* | awk 'NR==1{min=$3} END { printf "%d %d %d\n", min, $3, $3-min }'`)

    min=${range[0]}; max=${range[1]}; dlt=${range[2]}

    for h in {0..100}; do 
        seq 0 1023 | awk -vh=$h -vmin=$min -vdlt=$dlt 'BEGIN { n=0
                for (i=255; i>=0; i--) { pal[n] = sprintf("%d 0 255", i); n++ }
                for (i=0; i<=255; i++) { pal[n] = sprintf("0 %d %d", i, 255-i); n++ }
                for (i=0; i<=255; i++) { pal[n] = sprintf("%d 255 0", i); n++ }
                for (i=255; i>=0; i--) { pal[n] = sprintf("255 %d 0", i); n++ }
            } { 
            print $1*dlt/1023+min" "h" "pal[$1]
        }' >> .scale 
    done

    echo "set term jpeg size 740,140; set nokey; 
        set title 'Flux (W/m²)'
        set xtics 100 out nomirror
        unset ytics; set noborder
        set xrange [$min:$max]; set yrange [0:100]
        rgb(r,g,b) = int(r)*65536 + int(g)*256 + int(b)
        plot '.scale' u 1:2:(rgb(\$3,\$4,\$5)) w dots lc rgb variable lw 1
    " | gnuplot > scale.jpg
}

plot() {
    for y in {1979..2019}; do echo $y >&2; echo "  
        set term jpeg size 740,420; set nokey
        set title '$y'
        set yrange [-180:180]; set xrange [0:720]
        set noborder; unset colorbox 
        unset xtics; unset ytics
        rgb(r,g,b) = int(r)*65536 + int(g)*256 + int(b)
        plot '.c${y}' u (\$1*2):(\$2*2):(rgb(\$3,\$4,\$5)) pt 5 ps 1 lc rgb variable
        " | gnuplot > c$y.jpg
    done
}

animate() {
    convert -loop 0 -delay 50 c*.jpg geoanim.gif
}

clean() { rm -f .geo* .[fyc]* .scale; }

Run it:

$ source geochg.sh
$ require  # Linux Only
$ download
$ extract
$ annualize
$ colorize
$ scale
$ plot
$ animate
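
If you prefer a single line, the functions can also be chained (the same pattern used for map.sh later on):

$ . geochg.sh && download && extract && annualize && colorize && scale && plot && animate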

Windows users need the imagemagick package from Cygwin.

What caused 40 years of global warming?

I’m going to ignore the typical nonsense mainstream narrative, and do this analysis in the tradition of: Measuring Geothermal – A Revolutionary Hypothesis.

I will use 41 years of NCEP Reanalysis Data. Create a new file fluxchange.sh, and paste:

# source fluxchange.sh
# Zoe Phin 2020/03/10
    
F=(0 ulwrf dswrf uswrf lhtfl shtfl)     # up LW, down SW, up SW, latent, sensible heat fluxes
O=(0 3201.5 3086.5 3131.5 856.5 2176.5) # unpacking offsets: flux = stored/10 + offset

require() { sudo apt install nco gnuplot; } # Linux Only
    
download() {
    b="ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/gaussian_grid"
    for i in ${F[*]}; do wget -O $i.nc -c $b/$i.sfc.mon.mean.nc; done
}

extract() {
    rm -f .fx*
    for i in {1..5}; do echo $i of 5 >&2
        for t in {000..491}; do
            ncks --trd -HC ${F[$i]}.nc -v ${F[$i]} -d time,$t | sed \$d | awk -F[=\ ] -vO=${O[$i]} -vt=$t '{ 
                W[$4]+=$8/10+O } END { # area-weight each latitude band by cos(latitude)
                for (L in W) { T += W[L]/192*cos(L*atan2(0,-1)/180) } # atan2(0,-1) = pi
                printf "%04d %02d %7.3f\n", t/12+1979, t%12+1, T/60.1647 }' # 60.1647 ~ sum of cos weights
        done | tee -a .fx$i
    done
}

annualize() {
    for i in {1..5}; do
        awk '{ T[$1]+=$3 } END { for (y in T) printf "%04d %7.3f\n", y, T[y]/12 }' .fx$i > .af$i
    done
}

change() {
    paste .af1 .af2 .af3 .af4 .af5 | awk '{ 
        printf "%s %s %s %s %s %s | %7.3f %7.3f %7.3f\n", 
            $1, $2, $4, $6, $8, $10, $2+$8+$10, $4-$6, $2-($4-$6)+$8+$10 }' | tee fluxchg.csv | awk '
        NR==1 { Ui=$2; Ni=$5+$6; Si=$9; Gi=$10 } END { 
            dU=$2-Ui; dN=$5+$6-Ni; dS=$9-Si; dG=$10-Gi
            printf "Upwelling Change:\t%7.3f W/m^2\n", dU
            printf "NonRadiative Change:\t%7.3f W/m^2\n\n", dN
            printf "Net Solar Change:\t%7.3f W/m^2\n", dS
            printf "Geothermal Change:\t%7.3f W/m^2\n", dG
            
        }'
}

plot() {
    echo "set term png size 740,550 font 'arial,12'; unset key; set grid
    plot 'fluxchg.csv' u 1:9 t 'Net Solar' w lines lw 3 lc rgb 'orange'" | gnuplot > slrchg.png

    echo "set term png size 740,550 font 'arial,12'; unset key; set grid
    plot 'fluxchg.csv' u 1:10 t 'Geothermal' w lines lw 3 lc rgb 'green'" | gnuplot > geochg.png
}

Run it:

$ source fluxchange.sh
$ require                 # linux only
$ download
$ extract
$ annualize
$ change

Upwelling Change:         3.401 W/m^2
NonRadiative Change:      4.784 W/m^2

Net Solar Change:         1.419 W/m^2
Geothermal Change:        6.766 W/m^2

The results are the changes for years 1979 to 2019 (inclusive). The upwelling radiation flux and non-radiative flux equivalent changed by a combined 3.401 + 4.784 = 8.185 W/m², and the attribution is properly divided among (a quick re-derivation from fluxchg.csv follows the list):

  1. The change in insolation (primarily due to reduced cloud cover) – 1.419 W/m²
  2. Internal geothermal changes within the Earth – 6.766 W/m²
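
All four deltas can be recomputed straight from fluxchg.csv (printed in the Addendum below) without rerunning extract(). Note that awk counts the ‘|’ separator as a field of its own, so the net-solar and geothermal columns land in $9 and $10:

$ awk 'NR==1 { U=$2; N=$5+$6; S=$9; G=$10 } END {
    printf "dU=%.3f dN=%.3f dS=%.3f dG=%.3f\n", $2-U, $5+$6-N, $9-S, $10-G }' fluxchg.csv
dU=3.401 dN=4.784 dS=1.419 dG=6.766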

The crackpot mainstream greenhouse gas theory lacks empirical evidence, and yet its followers have the nerve to claim that humans are mostly responsible for recent warming. Nonsense. The causes were always #1 and #2.

Plot results:

$ plot

Two new files created: slrchg.png and geochg.png

Net Solar @ Surface (W/m²)
Geothermal @ Surface (W/m²)

Enjoy 🙂 – Zoe

Addendum

$ cat fluxchg.csv

1979 395.786 186.921 26.919 87.056 7.335 | 490.177 160.002 330.175
1980 396.248 186.179 27.066 88.049 7.452 | 491.749 159.113 332.636
1981 395.826 186.227 26.999 87.365 6.982 | 490.173 159.228 330.945
1982 395.207 187.333 26.994 87.962 7.934 | 491.103 160.339 330.764
1983 396.148 187.429 27.092 87.688 7.736 | 491.572 160.337 331.235
1984 395.186 187.830 26.942 86.853 7.945 | 489.984 160.888 329.096
1985 394.914 187.021 27.373 86.890 7.407 | 489.211 159.648 329.563
1986 395.503 186.138 26.799 88.883 7.567 | 491.953 159.339 332.614
1987 396.042 186.786 27.074 89.541 7.123 | 492.706 159.712 332.994
1988 396.403 185.917 26.844 88.428 7.598 | 492.429 159.073 333.356
1989 395.692 187.332 26.878 88.224 7.659 | 491.575 160.454 331.121
1990 396.751 185.962 26.491 88.830 7.405 | 492.986 159.471 333.515
1991 396.649 186.832 26.841 88.516 7.939 | 493.104 159.991 333.113
1992 395.438 187.175 27.030 89.625 8.814 | 493.877 160.145 333.732
1993 395.298 187.121 26.921 89.680 8.397 | 493.375 160.200 333.175
1994 395.859 187.384 27.092 90.347 8.459 | 494.665 160.292 334.373
1995 396.609 186.948 26.720 90.761 8.292 | 495.662 160.228 335.434
1996 395.938 186.889 27.293 92.229 8.509 | 496.676 159.596 337.080
1997 396.798 187.287 26.950 92.895 7.849 | 497.542 160.337 337.205
1998 397.931 186.646 26.847 92.945 7.873 | 498.749 159.799 338.950
1999 396.517 187.779 26.880 92.037 8.052 | 496.606 160.899 335.707
2000 396.340 187.967 27.053 94.193 7.936 | 498.469 160.914 337.555
2001 397.321 187.681 26.674 94.724 7.890 | 499.935 161.007 338.928
2002 397.710 188.238 26.708 94.694 7.949 | 500.353 161.530 338.823
2003 397.804 188.196 26.783 95.019 7.977 | 500.800 161.413 339.387
2004 397.397 187.912 26.836 95.386 7.933 | 500.716 161.076 339.640
2005 398.228 187.168 26.577 93.679 7.619 | 499.526 160.591 338.935
2006 397.815 187.226 26.505 93.671 6.732 | 498.218 160.721 337.497
2007 397.605 186.780 26.632 93.904 6.247 | 497.756 160.148 337.608
2008 397.106 188.004 27.021 92.936 6.565 | 496.607 160.983 335.624
2009 397.900 187.515 26.678 93.026 7.348 | 498.274 160.837 337.437
2010 398.153 186.115 26.535 94.463 6.238 | 498.854 159.580 339.274
2011 397.179 186.900 26.747 93.863 6.269 | 497.311 160.153 337.158
2012 397.674 187.437 26.768 93.584 6.606 | 497.864 160.669 337.195
2013 397.805 187.167 26.953 93.387 6.382 | 497.574 160.214 337.360
2014 398.137 187.495 26.908 94.028 6.460 | 498.625 160.587 338.038
2015 398.972 187.381 26.727 94.254 6.797 | 500.023 160.654 339.369
2016 399.599 185.911 25.952 93.387 6.276 | 499.262 159.959 339.303
2017 399.039 186.358 26.212 94.697 6.395 | 500.131 160.146 339.985
2018 398.596 187.198 26.528 94.232 6.356 | 499.184 160.670 338.514
2019 399.187 187.909 26.488 93.321 5.854 | 498.362 161.421 336.941
Columns:

Column 1: Year
Column 2: Earth Longwave Upwelling
Column 3: Solar Shortwave Downwelling
Column 4: Solar Shortwave Upwelling
Column 5: Latent Heat
Column 6: Sensible Heat
Column 7: Total Equivalent Received by Atmosphere
Column 8: Net Solar (Shortwave Down minus Up)
Column 9: Geothermal (#2 - (#3 - #4) + #5 + #6)
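
Column 9 can be recomputed from columns 2 through 6 on any row (again, awk counts the ‘|’ as its own field, so the stored column 9 is $10); checking the first row:

$ awk 'NR==1 { printf "%s computed=%.3f stored=%s\n", $1, $2-($3-$4)+$5+$6, $10 }' fluxchg.csv
1979 computed=330.175 stored=330.175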

Dumbest Math Theory Ever

Mainstream climate scientists believe in the dumbest math theory ever devised to try to explain physical reality. It is called the Greenhouse Effect. It’s so silly and unbelievable that I don’t even want to give it the honor of calling it a scientific theory, because it is nothing but ideological mathematics that has never been empirically validated. In fact, it rests on a post hoc fallacy: the surface is hotter than what the sun alone can do, therefore greenhouse gases did it!

Today we will play with this silly math theory called the greenhouse effect. Here are two examples of its typical canonical depiction:


Let’s get started. Please create a new file called gheffect, and paste the following into it:

# bash gheffect
# Zoe Phin, 2020/03/03

[ -z "$TSI" ] && TSI=1361   # solar constant, W/m^2 (NASA Earth Fact Sheet)
[ -z "$ALB" ] && ALB=0.306  # Bond albedo

echo $1 | awk -vALB=$ALB -vTSI=$TSI 'BEGIN { 
		SIG = 5.67E-8                     # Stefan-Boltzmann constant
		CURR = LAST = SUN = TSI*(1-ALB)/4 # absorbed solar flux, W/m^2
		printf "Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space\n"
	} {
	for (i=1 ;; i++) {
		printf "%3d | %7.3f W | %7.3f C ", i, CURR, (CURR/SIG)^0.25-273.16

		CURR = SUN + $1*LAST/2 ; GHE = SUN - (LAST*(1-$1))

		printf "| %7.3f W | %7.3f W | %07.3f W\n", GHE, CURR-LAST, CURR-GHE

		if ( sprintf("%.3f", CURR) == sprintf("%.3f", LAST) ) break

		#if ( CURR==LAST ) break

		LAST = CURR
	}
}'

Now run it with atmospheric emissivity = 0.792:

$ bash gheffect 0.792

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 236.133 W | -19.125 C | 187.018 W |  93.509 W | 142.625 W
  2 | 329.642 W |   2.971 C | 167.568 W |  37.030 W | 199.104 W
  3 | 366.672 W |  10.419 C | 159.866 W |  14.664 W | 221.470 W
  4 | 381.336 W |  13.212 C | 156.816 W |   5.807 W | 230.327 W
  5 | 387.142 W |  14.296 C | 155.608 W |   2.300 W | 233.834 W
  6 | 389.442 W |  14.722 C | 155.130 W |   0.911 W | 235.223 W
  7 | 390.352 W |  14.890 C | 154.940 W |   0.361 W | 235.773 W
  8 | 390.713 W |  14.957 C | 154.865 W |   0.143 W | 235.991 W
  9 | 390.856 W |  14.983 C | 154.835 W |   0.057 W | 236.077 W
 10 | 390.912 W |  14.994 C | 154.824 W |   0.022 W | 236.111 W
 11 | 390.935 W |  14.998 C | 154.819 W |   0.009 W | 236.125 W
 12 | 390.944 W |  14.999 C | 154.817 W |   0.004 W | 236.130 W
 13 | 390.947 W |  15.000 C | 154.816 W |   0.001 W | 236.132 W
 14 | 390.949 W |  15.000 C | 154.816 W |   0.001 W | 236.133 W

W is shorthand for W/m². Parameters are taken from NASA Earth Fact Sheet.

As you can see, by delaying outgoing radiation for 14 [¹] seconds [²], we have boosted surface upwelling radiation by an additional ~66% (154.8/236.1 W/m²). Amazing, right? That’s what my program shows, and that’s what is claimed:

This is zero in the absence of any long‐wave absorbers, and around 155 W/m² in the present‐day atmosphere [Kiehl and Trenberth, 1997]. This reduction in outgoing LW flux drives the 33°C greenhouse effect …

Attribution of the present‐day total greenhouse effect
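
The loop has a closed form: each second adds e/2 of the previous upwelling, a geometric series SUN*(1 + e/2 + (e/2)² + …) that converges to SUN/(1 - e/2). A one-liner with the script’s own constants (my arithmetic) confirms where the table is heading:

$ awk 'BEGIN { SUN=1361*(1-0.306)/4; e=0.792
    U=SUN/(1-e/2); printf "%.3f W -> %.3f C\n", U, (U/5.67E-8)^0.25-273.16 }'
390.950 W -> 15.000 C

(The three-decimal stopping rule halts the loop one step shy of this limit, at 390.949 W.)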

The main prediction of the theory is that as the atmosphere absorbs more infrared radiation, the surface will get warmer. Let’s rerun the program with a higher atmospheric emissivity = 0.8

$ bash gheffect 0.8

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 236.133 W | -19.125 C | 188.907 W |  94.453 W | 141.680 W
  2 | 330.587 W |   3.168 C | 170.016 W |  37.781 W | 198.352 W
  3 | 368.368 W |  10.746 C | 162.460 W |  15.113 W | 221.021 W
  4 | 383.481 W |  13.614 C | 159.437 W |   6.045 W | 230.088 W
  5 | 389.526 W |  14.738 C | 158.228 W |   2.418 W | 233.715 W
  6 | 391.944 W |  15.184 C | 157.745 W |   0.967 W | 235.166 W
  7 | 392.911 W |  15.361 C | 157.551 W |   0.387 W | 235.747 W
  8 | 393.298 W |  15.432 C | 157.474 W |   0.155 W | 235.979 W
  9 | 393.453 W |  15.461 C | 157.443 W |   0.062 W | 236.072 W
 10 | 393.515 W |  15.472 C | 157.431 W |   0.025 W | 236.109 W
 11 | 393.539 W |  15.477 C | 157.426 W |   0.010 W | 236.124 W
 12 | 393.549 W |  15.478 C | 157.424 W |   0.004 W | 236.130 W
 13 | 393.553 W |  15.479 C | 157.423 W |   0.002 W | 236.132 W
 14 | 393.555 W |  15.479 C | 157.423 W |   0.001 W | 236.133 W

A 1% rise in atmospheric emissivity (0.8/0.792) predicts a 0.479 °C rise in surface temperature.
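
The same 0.479 falls straight out of the closed form above, without iterating:

$ awk 'BEGIN { SUN=1361*(1-0.306)/4; SIG=5.67E-8
    T1=((SUN/(1-0.792/2))/SIG)^0.25; T2=((SUN/(1-0.800/2))/SIG)^0.25
    printf "dT = %.3f C\n", T2-T1 }'
dT = 0.479 C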

You would think such intelligent and “correct” mathematics would be based on actual experiments, but you would be wrong; it is based on nothing but its own presuppositions, and has been for more than a century by name, and two centuries by concept.

Let’s outline a very simple experiment to test whether the greenhouse effect is true:

          Solid Surface
               v

1) Person   => |     IR Camera

2) Person   <- | ->  IR Camera

And repeats until "equilibrium"

Radiation leaves the body and strikes a screen. After absorption, some radiation will go out to the IR camera, and the rest will go back to the person, thereby warming them up further, according to greenhouse effect theory. Note that we don’t even need absorption: merely reflecting a person’s radiation back at them should, on this theory, warm them up.

Let’s assume the human body emits 522.7 W/m² (37 °C) (emissivity: 0.9961, [Sanchez-Marin 2009]). For compatibility with my program, we multiply this figure by 4 and call it TSI. Let’s also assume the screen and the air in between have a combined total emissivity of 0.9.
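
The 522.7 figure is plain Stefan-Boltzmann arithmetic; here is a one-line sanity check (my arithmetic, using the same 273.16 offset the script uses):

$ awk 'BEGIN { printf "%.1f W/m^2\n", 0.9961*5.67E-8*(37+273.16)^4 }'
522.7 W/m^2

Now run: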

$ TSI=2090.8 bash gheffect 0.9
Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 362.754 W |   9.658 C | 326.478 W | 163.239 W | 199.515 W
  2 | 525.993 W |  37.188 C | 310.154 W |  73.458 W | 289.296 W
  3 | 599.451 W |  47.498 C | 302.809 W |  33.056 W | 329.698 W
  4 | 632.507 W |  51.830 C | 299.503 W |  14.875 W | 347.879 W
  5 | 647.382 W |  53.725 C | 298.016 W |   6.694 W | 356.060 W
  6 | 654.076 W |  54.566 C | 297.346 W |   3.012 W | 359.742 W
  7 | 657.088 W |  54.943 C | 297.045 W |   1.356 W | 361.398 W
  8 | 658.443 W |  55.112 C | 296.909 W |   0.610 W | 362.144 W
  9 | 659.053 W |  55.188 C | 296.848 W |   0.274 W | 362.479 W
 10 | 659.328 W |  55.222 C | 296.821 W |   0.124 W | 362.630 W
 11 | 659.451 W |  55.238 C | 296.809 W |   0.056 W | 362.698 W
 12 | 659.507 W |  55.244 C | 296.803 W |   0.025 W | 362.729 W
 13 | 659.532 W |  55.248 C | 296.801 W |   0.011 W | 362.743 W
 14 | 659.543 W |  55.249 C | 296.799 W |   0.005 W | 362.749 W
 15 | 659.548 W |  55.250 C | 296.799 W |   0.002 W | 362.752 W
 16 | 659.550 W |  55.250 C | 296.799 W |   0.001 W | 362.753 W
 17 | 659.552 W |  55.250 C | 296.799 W |   0.000 W | 362.753 W

We see that the screen is “trapping” a lot of human radiation, keeping it from reaching the IR camera, and the theory expects an extra 296.8 W/m² greenhouse effect, bringing us up to 55°C. Merely placing a screen in front of us should make us feel as if we’re stepping into a sauna.

https://youtu.be/fpx7hsoYEt4 – Look at all the trapped radiation!
https://youtu.be/Fx49t4sv7f0 – Look at all the trapped radiation!

These people must really be feeling the heat. But they aren’t, and for good reason: preventing radiation from reaching a colder place does not cause heating back at the source. Had these people had thermometers strapped to them, they would have noted a virtually zero temperature rise (just the small amount due to blocked convection). Look very closely at the videos: note the seconds when the screens are placed in front of their faces, and notice the lack of any change in the thermal readings. None!

All empirical evidence shows the opposite of the claims of the greenhouse effect.

So the question remains, why is the surface hotter than the sun can make it alone?

Energy Budget

If we look at the energy budget, we can see a dependency loop between surface and atmosphere: Surface -> Atmo = 350 and Atmo -> Surface = 324. So which came first, the chicken or the egg? This is nonsense. You can’t have a dependency loop for heat flow. Let’s try a theory that causes no mental anguish and does not lack empirical evidence. For this, we ignore the climate “scientists” and go to the geophysicists:

https://www.routledgehandbooks.com/doi/10.1201/9781315371436-4

Here we see that Earth’s geothermal energy is capable of delivering 0 °C to the surface; this is equivalent to 315.7 W/m². We add the sun and subtract latent+sensible heat:

315.7 + 168 – 24 – 78 = 381.7 = Upwelling Radiation
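
Both numbers are easy to verify (the 0 °C blackbody flux uses the same 273.16 offset as the gheffect script):

$ awk 'BEGIN { printf "%.1f\n%.1f\n", 5.67E-8*273.16^4, 315.7+168-24-78 }'
315.7
381.7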

Now we get a figure that is 390 – 381.7 = 8.3 W/m² off, but that’s OK: latent and sensible heat are not directly measured but estimated under certain physical assumptions, and/or the 0 °C geothermal figure is an approximation too.

Now we finally realize that the greenhouse effect is a hoax, and nothing but geothermal flipped upside down. There is no Downwelling Radiation, there is only Upwelling-from-measurement-instrument Radiation (see here). Those who read Why is Venus so hot? probably already saw where I was going. Doesn’t this make more sense than backradiation raising temperature? Reality shows absolutely normal geothermal and solar combining to produce what we observe. We see all normal heating, and no ugly backwards zig-zag heating.

Let’s summarize:

     Upwelling
         ^
  |      |       ^        ^
  v      |       |        |
===============================
         |    Latent  Sensible
Solar ---+     Heat      Heat 
         |       ^        ^         
         |       |        |
         +------ Geothermal

Now which explanation does Occam’s Razor favor?

I hope you have enjoyed the return to sanity.

Sincerely, -Zoe

Notes

[¹] We only care about matching 3 decimal places. If we want to extend it to IEEE754 64-bit precision, it takes 40 seconds. Not that this matters much; most of the work is accomplished in the first 5 seconds.

[²] I debated with myself whether to use the term seconds or iterations. Real physical calculations would take mass and heat capacity into account, but since greenhouse theorists don’t use these, I won’t either. Their simple model is in seconds.

Instructions for Windows Users

You can run all the code at this blog on Windows, rather than Linux. I will show you how. The preferred Windows is version 10; that’s the only one I’ve set up. Any Windows newer than 7 should work in theory.


Run Windows Command Line (<Win>+R; type “cmd” then <enter>)

cd \
mkdir Zoe

Zoe is your choice of folder name. Make it whatever you want, remember it, then type “exit” <enter>.


Download Cygwin Setup Program. [Direct Link to 64-bit Exe]

Run Exe, click <Next>

Choose 1st
Set the root directory
Set to same previous directory
Choose a connection method, I choose 1st
Choose a mirror
Change View to “Full”

You will need 3 packages: wget, lua (5.3), gnuplot.

Type the package name in the search box, then under New, click on the word “Skip”; a version number will appear.

For lua, keep clicking “Skip/Version #” until version 5.3.x appears. Failure to do so will break gnuplot.

When done, click <Next>. The next screen suggests dependencies; just click <Next>, then wait through the download & install process.

First is required, 2nd is your choice

Download Miniconda3. [Direct Link to 64-bit Exe]. Run Exe, click <Next>, click <I Agree>

Your choice; I do recommend it.
Place it inside the previous Cygwin directory for neatness.
I select both. Use caution if you are a Python developer.

When complete, click <Next>, click <Next>, deselect both “Learn …”, click <Finish>

Double click

A new user home environment will be created.

The Path is “C:\Zoe\home\<Windows Username>”. All code can just go here.

Inside the Terminal:

$ conda config --add channels conda-forge
$ conda install nco numpy

Second command requires your confirmation. Type “y” and <enter>.

When done, exit with “exit” <enter>, and relaunch Cygwin Terminal.


Use any text editor you like (it must be able to save Unix-style files), or get Sublime. [Direct Link to 64-bit Exe]

After install, run, and select: File > New File

Make sure to set View > Line Endings > Unix, otherwise the script will not run!

Copy and paste the following sample code into the empty new file:

# source map.sh

download() {
	wget -c ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis/surface/land.nc
}

extract() {
	# print recentered-longitude, latitude, land-mask triples for plotting
	ncks --trd -HC land.nc | awk -F [=\ ] '{
		if ($6 < 191) {$6+=169} else {$6-=191}
		print $6" "$4" "$8
	}' > map.dat
}

plot() {
	echo 'set term png size 362,182; set nokey
		set yrange [-90:90]; set xrange [0:360]
		set noborder; unset colorbox 
		unset xtics; unset ytics
		set palette define (0 "blue", 1 "orange") 
		set margin 0,0,0,0
		plot "map.dat" u 1:2:3 pt 5 ps 1 lc palette' | gnuplot > map.png
}

Save this file to: C:\Zoe\home\<Windows Username>\map.sh

Now execute from Cygwin Terminal:

$ source map.sh
$ download
$ extract
$ plot

* or *

$ . map.sh && download && extract && plot

The result is a file called map.png inside C:\Zoe\home\<Windows Username>. Use Windows Explorer to find it and launch it.

If map.png looks like this … you are done! Most of my blog code will work.


Happy Coding 🙂 -Zoe