No Greenhouse Effect Here

In a previous article, I determined the average surface temperature of Earth without an atmosphere using a collection of equal-area pieces of our planet. What’s interesting is that many places, on average, are hotter without the so-called greenhouse effect than with it! And it’s not a small number of places!

The following are latitude bands and the area (km²) within each band that has higher shortwave input than longwave output:

$ . isccp.sh ; lats

+1.25 540799
-1.25 1390626
-3.75 2081970
-6.25 2320620
-8.75 2555916
-11.25 2786508
-13.75 2856622
-16.25 3793286
-18.75 4958656
-21.25 6501264
-23.75 7253134
-26.25 7659135
-28.75 7587944
-31.25 7734300
-33.75 8095815
-36.25 8432349
-38.75 8445865
-41.25 8211184
-43.75 7806290
-46.25 7617951
-48.75 7182576
-51.25 6887621
-53.75 6502608
-56.25 5873052
-58.75 4387233
-61.25 2404670
-63.75 538307

$ . isccp.sh ; lats | awk '{A+=$2}END{print A}'

142406301

A latitude band is the latitude shown +/- 1.25 degrees.

The total area is 142,406,301 km², which is nearly 28% of the entire Earth!
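
A quick check of that percentage (a sketch, using Earth's total surface area of about 510.07 million km²):

band_total = 142406301                 # km², sum of the latitude-band areas above
earth_area = 510072158                 # km², Earth's total surface area
print(100 * band_total / earth_area)   # -> ~27.9 %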

Data for all 6596 cells can be downloaded here. The cells where the 3rd column is positive are the ones I’m discussing.

Common sense tells us that if the radiation-based greenhouse effect doesn’t work for nearly 28% of the planet, it probably doesn’t work anywhere at all.

Nearly the entire anti-greenhouse effect shown here is in southern hemisphere oceans.

That’s all. Enjoy 🙂

Code

# Zoe Phin, 2022/07/06
# File: isccp.sh
# Run: . isccp.sh ; download ; short | avg ; long | avg

grab() { wget -qO- $1 | od -An -tf4 -v -w4 --endian=big; }
download() {
    wget -O grid https://isccp.giss.nasa.gov/pub/tables/ISCCP.D1GRID.0.GLOBAL.1983.99.99.9999.GPC
    grab "https://isccp.giss.nasa.gov/pub/data/surface/D1AVGANN__BBEMIS" > em.is
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGSON__SWXXTOADW" > sw.in
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFDW" > sw.dn
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFUW" > sw.up
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGSON__LWFLSRFUW" > lw.up
}
avg() { awk '{S+=$1;N+=1}END{printf "%.5f\n",S/N}'; }
short() { 
    paste sw.in sw.dn sw.up em.is | awk '!/*/{
        printf "%.5f\n", ($1*(1-$3/$2)/$4/5.670367E-8)^0.25
    }'
}
long() { awk '{ printf "%.5f\n", ($1/5.670367E-8)^0.25}' lw.up; }	
compare() {
    short > .sh; long > .lo
    sed 1,2d grid > .ta
    paste .sh .lo .ta | awk '{
        printf "%.3f %.3f %+8.3f %d %+6.2f\n", $1, $2, $1-$2, $10, $8-90
    }'
}
lats() {
    compare | sed \$d | awk '$3~/+/{L[$5]+=$4} END {for(l in L)print l,L[l]}' | sort -rn
}

Shrinking GH Effect Closer to Reality

In a previous article, I attempted to figure out a more accurate surface temperature for an imaginary Earth with no atmosphere, compared to the standard approach of completely neglecting surface emissivity. The formula I used is:

(S/4)·(1 − α) = ε·σ·T⁴, with S/4 = 339.22 W/m², α = 0.1229, and ε = 0.93643

The result I got was 273.6 K. This gives me a theoretical greenhouse gas warming of about 16.1 K, or about half of the figure obtained with the standard approach.

The problem with this result is that I used a globally averaged albedo and emissivity.

A more accurate way to do this is to break the Earth up into small pieces and calculate local temperatures based on local insolation, local albedo, and local emissivity, then average those.
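
As a minimal sketch of that per-cell averaging (toy numbers only, not the ISCCP data; the actual extraction is in the code at the end of this post):

import numpy as np

SIGMA = 5.670367e-8                            # Stefan-Boltzmann constant, W/m²K⁴

# Toy per-cell inputs (placeholders, NOT real ISCCP values):
insolation = np.array([410.0, 340.0, 180.0])   # TOA shortwave in, W/m²
albedo     = np.array([0.06, 0.15, 0.60])      # local surface albedo
emissivity = np.array([0.96, 0.93, 0.99])      # local broadband emissivity

# Local no-atmosphere temperature per cell, then the average of those temperatures
T_cell = (insolation * (1 - albedo) / (emissivity * SIGMA)) ** 0.25
print(T_cell.mean())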

The only project that I’ve managed to find that provides broadband emissivity data is NASA’s ISCCP. It’s a bit dated: 1983 to 2004, but why not use it?

The Earth is broken down into 6596 equal area pieces:

ISCCP Data Elements

The needed calculation can be summarized as: for each cell, T = [ S·(1 − SW↑/SW↓) / (ε·σ) ]^0.25, where S is the TOA shortwave input, SW↓ and SW↑ are the downwelling and upwelling shortwave at the surface (their ratio gives the local albedo), ε is the local broadband emissivity, and σ is the Stefan-Boltzmann constant. The 6596 cell temperatures are then averaged.

The result I got is:

$ . isccp.sh ; short | avg

268.82074

This is ~ -4.3°C.

For fair assessment, we also need to do the same type of operation on ISCCP’s upwelling longwave radiation:

$ . isccp.sh ; long | avg

287.34596

The difference is ~ 18.5 K.

Now let’s take a look at common propaganda taught to children:

Nope. The correct answer is -4.3°C!

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2022/07/01
# File: isccp.sh
# Run: . isccp.sh ; download ; short | avg ; long | avg
grab() { wget -qO- $1 | od -An -tf4 -v -w4 --endian=big; }
download() {
    grab "https://isccp.giss.nasa.gov/pub/data/surface/D1AVGANN__BBEMIS" > em.is
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGSON__SWXXTOADW" > sw.in
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFDW" > sw.dn
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFUW" > sw.up
    grab "https://isccp.giss.nasa.gov/pub/data/FC/FDAVGSON__LWFLSRFUW" > lw.up
}
avg() { awk '{S+=$1;N+=1}END{printf "%.5f\n",S/N}'; }
short() { 
    paste sw.in sw.dn sw.up em.is lw.up | awk '!/*/{
        printf "%.5f\n", ($1*(1-$3/$2)/$4/5.670367E-8)^0.25
    }'
}
long() { awk '{ printf "%.5f\n", ($1/5.670367E-8)^0.25}' lw.up; }

7 Decades of Net Solar Radiation

Imagine thinking that the Intergovernmental Panel on Climate Change (IPCC) is an actual legitimate scientific organization and seeing this diagram:

“Woah” you think to yourself. “Look at what humans have done!”

Notice, there is one natural factor: “Solar Irradiance”. But what they don’t tell you is that this is solar variation at the top of the atmosphere!

What’s been going on at the bottom of the atmosphere for 7 decades?

Net Solar Radiation @ Surface, Source

And then you realize what kind of a blatant fraud the IPCC really is.

3 W/m2 over ~70 years or 4 W/m2 over ~35 years completely wipes away their nonsense.

But facts won’t stop them from playing their little game of blaming those they want to punish, sad to say.

Enjoy 🙂 -Zoe

Alternate View

Code

# Zoe Phin, 2022/06/23
# File: ncep.sh
# Run: source ncep.sh; require; download; extract; plot
# Output: netsol.png

require() { sudo apt-get install -y gnuplot python3-xarray python3-netcdf4; }
download() { wget -O ncep.nc 'http://climexp.knmi.nl/NCEPNCAR40/nswrs.sfc.mon.mean.nc'; }
extract() { echo "import xarray as x; import numpy as n
a=6378.137; e=1-6356.752**2/a**2; r=n.pi/180
d = x.open_dataset('ncep.nc')['nswrs']
by_lat=(a*r)**2*(1-e)*n.cos(r*d.lat)/(1-e*n.sin(r*d.lat)**2)**2
for m in d.weighted(by_lat).mean({'lon','lat'}):
    print(m.values)" | python3 -u | awk '!/nan/{print 1948+NR/12+1/24" "$1}' | tee sol.mon
}
plot() {
    cat sol.mon | yoy 12 > sol.yoy
    paste sol.mon sol.yoy > sol.csv
    echo "set term png size 740,540
    set key outside top center horizontal
    set grid xtics ytics
    set xrange [1947:2023]
    set format y '%.1f'
    set ytics 1
    plot 'sol.csv' u 1:3 t 'NCEP/NCAR Net Solar (W/m²) - 12mo CMA' w l lw 2 lc rgb 'orange'
    " | gnuplot > netsol.png 
}
yoy() { awk '{printf "%s ",$2}' | awk -vp=$1 '{ h=p/2;
    for (i=0;i<h;i++) print ""
    for (i=h;i<=NF-h;i++) { s=0
        for (j=i-h+1;j<=i+h;j++) s+=$j/p
            printf "%8.6f\n", s
    } }'
}

Python NetCDF Latitude Area Weighted Global Average

The earth is not flat, and it’s not a sphere. If you’re doing data analysis on climate netcdf files with the Python programming language, you may need to figure out how to properly average gridded data over the oblate spheroid Earth. Unfortunately there are few to no simple guides on how to do this properly. Searching google with “python netcdf latitude weighted area average” yielded pretty much nothing useful. Adding “tutorial” finally got me somewhere. It got me here: The Correct Way to Average the Globe.

The tutorial is okay, but the author does things in the most convoluted way possible. The amount of code made my eyes glaze over, and so I abandoned using Python for this task. But recently I had some time to think, and I came up with a very short solution. It is in fact not very different from the Awk code I’ve used in many of my posts.

I hope this post will become the go-to guide for Python users on this question.

Here’s how it’s done:

# Zoe Phin, 2022/05/03
# File: berkland.py
# Run: python3 berkland.py
# Output: berk.png

import os
import requests as r
import xarray as x  
import numpy as n
from matplotlib import pyplot as m

if not os.path.exists('berk.nc'):
    open('berk.nc','wb').write(r.get('http://berkeleyearth.lbl.gov/auto/Global/Gridded/Complete_TAVG_LatLong1.nc').content)

d = x.open_dataset('berk.nc')['temperature']

a = 6378.137; e = 1-6356.752**2/a**2; r = n.pi/180

by_lat = (a*r)**2*(1-e)*n.cos(r*d.latitude)/(1-e*n.sin(r*d.latitude)**2)**2

w = d.weighted(by_lat).mean({'longitude','latitude'})
    
m.plot(w.time.values, w.values)
m.title('Land-Only Temperature Anomaly')
m.savefig('berk.png')

This is what we get:

Yes, this is correct.

We can easily do a sanity check. The sum area of all the grid cells should add up to the surface area of the Earth:

# File: area.py
import xarray as x  
import numpy as n

d = x.open_dataset('berk.nc')['temperature']

a = 6378.137; e = 1-6356.752**2/a**2; r = n.pi/180

by_lat = (a*r)**2*(1-e)*n.cos(r*d.latitude)/(1-e*n.sin(r*d.latitude)**2)**2

s=0
for l in by_lat.values:
    s += l*360
print(s)
$ python3 area.py

510072158.7487793

This is the correct area in km2 when using this technique.

Please note that this code works as is because the grid cells are 1×1 degrees. The way you modify it for a different resolution is to change the by_lat line (a generalized sketch follows the list below):

by_lat = (a*r)**2 ...      # 1 degree
by_lat = (a*r*0.5)**2 ...  # 0.5 degree
by_lat = (a*r*0.25)**2 ... # 0.25 degree
... 
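
If it helps, here is the same cell-area line written as a small function with the grid spacing pulled out as a parameter (res is my own name for it, just an illustrative generalization):

import numpy as n

def by_lat(lat_deg, res=1.0):
    # Area (km²) of a res x res degree cell centered at latitude lat_deg,
    # using the same oblate-spheroid constants as above.
    a = 6378.137                        # equatorial radius, km
    e = 1 - 6356.752**2 / a**2          # squared eccentricity
    r = n.pi / 180
    return (a*r*res)**2 * (1-e) * n.cos(r*lat_deg) / (1 - e*n.sin(r*lat_deg)**2)**2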

That’s it.

Enjoy doing things the easy way.

-Zoe

Is Virology Pseudoscience?

You decide. Please watch this 7.5 hour film to find out why virology is pseudoscience.

The Viral Delusion (2022)

Press bottom right arrows square for fullscreen view. Popcorn recommended.

Don’t shoot the messenger. It took me 2 years to come to this view.

Enjoy 🙂 -Zoe

Update: You can also watch it in parts & better quality:

Part 1 – Behind The Curtain of The Pandemic. The Pseudoscience of SARS-COV2
Part 2 – Monkey Business. Polio, the Measles, and How It All Began…
Part 3 – The Mask of Death: Smallpox, The Plague and The Spanish Flu
Part 4 – AIDS, The Deadly Deception
Part 5 – Genetic Sequencing The Virus That Isn’t There

6 Decades of Snow Water Equivalent

Scientists developed a very high resolution dataset that goes back 6 decades: TerraClimate (Nature link). I am continuing my examination of their data (with model stuffing). Today I will look at their swe series from 1958 to 2021 (inclusive).

swe stands for Snow Water Equivalent.

Snow Water Equivalent – January 2021

Let’s see what the data shows …

I expected an increase given my previous article on global snow trend.

SWE ranges from 46 to 1196 mm. That would be 1150 mm over 64 years, or ~18mm/yr. Amazing!

We got an extra 1.15 meters of snow water equivalent, or 11.5 meters of snow over land in 6 decades, and all that extra man-made carbon dioxide failed to melt it. lol
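
For what it's worth, the depth figure above just assumes the common ~10:1 snow-to-liquid ratio (fresh snow density varies a lot, so treat it as a rough conversion):

swe_gain_mm = 1196 - 46            # mm of snow water equivalent gained over the record
snow_ratio  = 10                   # assumed ~10:1 fresh-snow-to-liquid ratio
print(swe_gain_mm / 1000, "m of SWE is roughly", swe_gain_mm * snow_ratio / 1000, "m of snow")
# -> 1.15 m of SWE is roughly 11.5 m of snow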

Let’s look at month to month changes …

Difference from Month to Month

That spike is very interesting. What could it be? An error in the data? or maybe this? It is exactly in January 2016. I’m too lazy to investigate this further.

Let’s take a look now at the annualized trend …

Annualized Trend

1981 is the only year when SWE declined.

The trend clearly shows a growth in the accumulation of Snow Water Equivalent. To be more exact:

1959 to 2021 (Inclusive)
Annualized Linear Regression Trend: 1.308 to 1.607: +22.845%

Definite Conclusion: Carbon Dioxide radiative forcing doesn’t work at all in the arctic regions where the bulk of SWE is.

Warmists still have a way out by claiming that warming drives more water vapor to the poles. That’s fine. But please, could you not at the same time tell us that CO2 will melt snow and ice in the arctic directly via the radiative greenhouse effect? Thank you.

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2022/06/14
# File: swe.sh
# Run: source swe.sh; require; download; extract; plot

require() { sudo apt-get install -y gmt nco gnuplot python3-xarray python3-netcdf4; }
download() {
    for y in {1958..2021}; do
        wget -cO $y.nc "http://thredds.northwestknowledge.net:8080/thredds/fileServer/TERRACLIMATE_ALL/data/TerraClimate_swe_$y.nc"
    done
}
extract() { echo "import xarray as x; import numpy as n
    a=6378.137; e=1-6356.752**2/a**2; r=n.pi/180
    d = x.open_dataset('1958.nc')['swe']
    by_lat=(a*r/24)**2*(1-e)*n.cos(r*d.lat)/(1-e*n.sin(r*d.lat)**2)**2/59035.372
    for y in range(1958,2022):
        for m in x.open_dataset(str(y)+'.nc')['swe'].weighted(by_lat).mean({'lon','lat'}).values:
            print(y,m)
    " | sed 's/\t//1' | python3 -u | tee swe.tmp
}
parse.orig() { awk 'NR==1 { L=$2 } NR>1 { printf "%.3f %.6f\n", 1958+NR/12-1/24, $2 }' swe.tmp; }
parse() { awk 'NR==1 { L=$2 } NR>1 { printf "%.3f %+10.6f\n", 1958+NR/12-1/24, $2-L; L=$2 }' swe.tmp; }
annual() { awk '{S[substr($0,1,4)]+=$2/12} END {for (y in S) printf "%d %.6f\n",y,S[y]}'; }
yoy() { awk '{printf "%s ",$2}' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2+1;j<i+p/2;j++) s+=$j/p
            printf "%8.6f\n", s
    } }'
}
plot() { 
#	parse.orig > swe.csv
#	parse > swe; parse | yoy 240 > swe.yoy
#	paste -d ' ' swe swe.yoy > swe.csv
    echo -n "Annualized Linear Regression Trend: "
    parse | annual | gmt gmtregress | awk 'NR==2{S=$3}END{printf "%.3f to %.3f: %+.3f%\n", S, $3, 100*($3-S)/S}'
    parse | annual | gmt gmtregress > swe.csv
    echo "set term png size 740,540
    set key outside top center horizontal
    set xtics out; set ytics out
    set mxtics 10; set mytics 2
    set grid xtics ytics
    set ytics format '%.1f'
#	set yrange [-21:41]
    set xrange [1959:2021]
    plot 'swe.csv' u 1:2 t 'Snow Water Equivalent (mm)' w lines lw 1 lc rgb 'blue',\\
         'swe.csv' u 1:3 t 'Linear Regression Trend' w lines lw 2 lc rgb '#000044'
    "| gnuplot > swe.png 
}

6 Decades of Shortwave Radiation

Scientists developed a very high resolution dataset that goes back 6 decades: TerraClimate (Nature link). I will/may be examining their data (with model stuffing) in the next few posts.

The first thing I downloaded was the srad dataset, all 8.4GB of it, from 1958 to 2021.

srad means Shortwave Radiation (to Surface).

Solar Downwelling Radiation – July 2021

I was a little disappointed that it only covers land. No oceans. But let’s see what we can get out of it anyway …

These numbers suggest this is solar downwelling, not Net Solar (minus solar upwelling). Also disappointing, but it will do.

If this data with model filling is true, the ~3 W/m2 difference (~179 minus ~176) over 6 decades is WAY more than could even be hypothesized by carbon dioxide warming over the last 2 centuries.

Again we see that the solar shortwave is the dominant force in all the global warming we observed.

Sorry, time’s up.

Best regards, -Z

P.S. No, I have no idea what happened there in the early 1970s.

Code

# Zoe Phin, 2022/06/11
# File: srad.sh
# Run: source srad.sh; require; download; extract; plot

require() { sudo apt-get install -y gmt nco gnuplot python3-xarray python3-netcdf4; }
download() {
    for y in {1958..2021}; do
        wget -O $y.nc "http://thredds.northwestknowledge.net:8080/thredds/fileServer/TERRACLIMATE_ALL/data/TerraClimate_srad_$y.nc"
    done
}
extract() { echo "import xarray as x; import numpy as n
    a=6378.137; e=1-6356.752**2/a**2; r=n.pi/180
    d = x.open_dataset('1958.nc')['srad']
    by_lat=(a*r/24)**2*(1-e)*n.cos(r*d.lat)/(1-e*n.sin(r*d.lat)**2)**2/59035.372
    for y in range(1958,2022):
        for m in x.open_dataset(str(y)+'.nc')['srad'].weighted(by_lat).mean({'lon','lat'}).values:
            print(y,m)" | sed 's/\t//1' | python3 -u | tee .srad
}
parse() { awk '{ printf "%.3f %.6f\n", 1958+NR/12-1/24,$2 }' .srad; }
annual() { awk '{S[substr($0,1,4)]+=$2/12} END {for (y in S) printf "%d %.6f\n",y,S[y]}'; }
plot() { 
    parse > srad.mon
    parse | annual > srad.ann
    parse | yoy 12 > srad.yoy
    paste -d ' ' srad.mon srad.yoy > srad.csv
    echo "set term png size 740,740
        set key outside top center horizontal
        set grid xtics ytics
        set xrange [1958:2021]
        set format y '%.1f'
        plot 'srad.csv' u 1:3 t 'Shortwave Radiation to Surface (W/m²) - 12mo CMA' w l lw 2 lc rgb 'orange'
    "| gnuplot > srad.png 
}
yoy() { awk '{printf "%s ",$2}' | awk -vp=$1 '{ h=p/2;
    for (i=0;i<h;i++) print ""
    for (i=h;i<=NF-h;i++) { s=0
        for (j=i-h+1;j<=i+h;j++) s+=$j/p
            printf "%8.6f\n", s
    } }'
}

Shrinking the Atmospheric Greenhouse Effect closer to reality

Mainstream climate science claims that without an atmosphere, Earth’s surface temperature would be, on average, below freezing. It would be about 33 degrees C colder than it is with an atmosphere. Actually, they claim that without greenhouse gases (GHGs) that would be the case. Someone at NASA even made this ridiculous statement:

Remove carbon dioxide, and the terrestrial greenhouse effect would collapse. Without carbon dioxide, Earth’s surface would be some 33°C (59°F) cooler.

NASA: What is the greenhouse effect

Ha ha! I’m sure they meant to say GHGs, not CO2, but let’s laugh at NASA for employing someone secretly biased like this.

There’s a link on that site to UCAR (Center for Science Education): The Greenhouse Effect. Here we read:

Without the greenhouse effect, Earth’s temperature would be below freezing.

— UCAR

Clicking on that link:

Without any heat-trapping greenhouse gases in our atmosphere, Earth would be a frozen ball of ice.

You can learn lots more details about the math at our Calculating Planetary Energy Balance & Temperature web page.

— UCAR

Great, we finally got to the meat. I was looking for this math page, but apparently it’s gone now from UCAR and links to an archive.org page. Maybe they are embarrassed? Anyway, here’s how this “iceball” theory works:

(S/4)·(1 − α) = σ·T⁴, with S = 1361 W/m² and α = 0.31, giving T ≈ 254 K, roughly 33-34 K below the observed ~288 K average.

The math is mostly fine (Earth is not a perfect sphere), but the real problem is the choice of 0.31 for the albedo. Most of that albedo comes from the atmosphere itself! This albedo value is also only useful when observing Earth from space. ~30% of incoming shortwave solar radiation is indeed reflected from the Earth toward an observing satellite. But this is NOT a metric for figuring out how much solar radiation reaches the surface, which is what is in question. And so this entire calculation is utterly superficial and meaningless.

Some have suggested using the Moon’s albedo to simulate Earth’s surface temperature without an atmosphere. The Moon’s albedo is about 0.12, and the standard calculation method would yield:

$ qalc -t '1361*(1-0.12)/4=5.670367e-8*x^4'

x ≈ 269.5674246   # (x = Temperature in Kelvin)

But the moon is the moon, and it’s not like the Earth, at least I don’t think so.

Let us look at an Energy Budget diagram to see what the proper parameter is:

CERES Energy Budget (2005-2015)

So 23 W/m2 is reflected at the surface, and that’s from (23+164)= 187 W/m2 that makes it past the atmosphere. 23/187 = 0.1229…

Ah, so that is indeed very close to the moon. Not a bad assumption.

Then again, 23 out of a total of 340 was reflected at the surface. Maybe the 187 is irrelevant? Can I be sure that the surface would not have rejected the same as the atmosphere? No, really, can I? I don’t know, but I don’t think so. If I’m right, then the surface albedo would be 23/340 = 0.0676.

This gives me more confidence:

The lunar average Bond albedo (at normal solar incidence) A is 0.12 (Vasavada et al. 2012). This is in agreement with the mean value of 0.122 found by Saari & Shorthill (1972). Vasavada et al. (2012) derived a mean albedo of 0.07 for mare and 0.16 for highland surfaces from measurements taken by the Diviner Lunar Radio Experiment. In a NASA summary of the Moon’s bulk parameters19, the Bond albedo is given by 0.11 and the geometric albedo by 0.12

The Moon at Thermal Infrared …

Mares are generally smooth and flat and take up ~16% of the moon’s surface. On Earth, our equivalent of lunar mares would be the oceans, and any body of water in general. Water bodies have a small albedo, around 0.06, similar to lunar mares. But our “mares” take up 71% of our surface. This fact alone makes me confident that Earth’s surface albedo is much lower than the moon’s.

Moving on…

The main problem with the mainstream no-atmosphere formula shown above is that it lacks EMISSIVITY, which is a huge mistake. Here’s the correction:

(S/4)·(1 − α) = ε·σ·T⁴

Now things get interesting.

What is the emissivity of the moon? This paper suggests it is definitely somewhere between 0.92 and 0.97. But this is computed from a narrow set of channels. Such an analysis on Earth also leads to a high result: 0.97-0.98, whereas the actual emissivity is found in my previous article: What is Earth’s Surface Emissivity? : 0.93643.

I would even bet that the moon’s albedo is just one minus its emissivity.

For Earth, let’s go back to the energy budget … 23/340 is 0.067647.

1 minus 0.067647 is 0.932353

0.932353 and 0.93643 are too close to be a coincidence. I think they are saying the same thing:

Screw albedo and emissivity, and just assume Kirchhoff’s Law of Radiation for this type of calculation? This implies:

S/4=σT⁴

A more accurate S/4 value is taken from my previous article: 339.22

$ qalc -t '339.22=5.670367e-8*x^4'
x ≈ 278.1106181

If we take the surface outgoing longwave radiation from CERES: 399.56 (Average, year 2020), we get …

$ qalc -t '399.56=5.670367e-8*x^4'
x ≈ 289.7294947

And the difference is about 11.62 degrees C

Now that is much less of a “greenhouse effect” than the fantastical 33 or 34 degrees claimed by mainstream climate scientists!

Without the atmosphere, the average temperature would be nearly 5 degrees C (278.1 K ≈ 4.96 °C).

Have a good day. -Zoe

Update

I don’t know how I missed this chart:

Averaging the emissivity across 7 channels yields:

(0.965+0.938+0.912+0.766+0.777+0.827+0.787)/7 = 

~ 0.853

That’s not exactly the right way to do it, as different channels yield different intensities via Planck’s Law, but I still think the emissivity is very close to, if not exactly, (1 − albedo).

Update

Just in case I’m wrong, I’ll show the calculation for an albedo of 0.1229 (derived from energy budget):

$ qalc -t '339.22*(1-0.1229)=0.93643*5.670367e-8*x^4'
x ≈ 273.5968031 

The GH effect would then be 16.13 K.
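
To keep the arithmetic in one place, here is a Python restatement of the qalc one-liners used in this post (same constants, same inputs):

SIGMA = 5.670367e-8                                    # Stefan-Boltzmann constant, W/m²K⁴

def T(flux_wm2):                                       # blackbody temperature for a flux in W/m²
    return (flux_wm2 / SIGMA) ** 0.25

print(T(339.22))                                       # Kirchhoff assumption, S/4 = σT⁴        -> ~278.11 K
print(T(399.56))                                       # CERES surface upwelling LW, 2020       -> ~289.73 K
print((339.22 * (1 - 0.1229) / (0.93643 * SIGMA)) ** 0.25)  # albedo 0.1229, emissivity 0.93643 -> ~273.60 K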

20 Years of Climate Change

Hi! My name is Zoe, and I’m a highly respectable person and therefore a climate alarmist.

Humans are causing severe climate change and 99.999999+ bazillion percent of scientists agree. Only a few bad and uneducated people disagree. I will show them with sound reasoning why they are wrong.

Here’s the best evidence that humans are to blame:

First, there was a lot of warming over the last 20 years all over the planet. We can see this in the excellent CERES project data:

Surface Upwelling Longwave Radiation (W/m²)

2001 398.9112
2002 399.1996
2003 399.2499
2004 399.1482
2005 399.2486
2006 399.2572
2007 399.1854
2008 399.1008
2009 399.1028
2010 399.0994
2011 399.0505
2012 399.0608
2013 399.0746
2014 399.1138
2015 399.2117
2016 399.3063
2017 399.3712
2018 399.4171
2019 399.4999
2020 399.5614

Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation

Look at that! An additional ~ +0.65 W/m2 coming from the surface. Merely showing warming is proof that humans caused it. But if that’s not enough for deniers like you, let’s proceed further.

We all know that greenhouse gases trap heat and reduce outgoing radiation. Since obviously humans alone have raised the atmospheric CO2 content with their bad polluting, we should expect to see a decrease in outgoing radiation:

Upwelling Longwave Radiation to Space (W/m²):

2001 240.1311
2002 240.3947
2003 240.4664
2004 240.4265
2005 240.4231
2006 240.4333
2007 240.4387
2008 240.3824
2009 240.3761
2010 240.3857
2011 240.3767
2012 240.3637
2013 240.3613
2014 240.3720
2015 240.3935
2016 240.4188
2017 240.4379
2018 240.4397
2019 240.4655
2020 240.4967

Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation

See? All good …. Wait …

Adding GHGs didn’t reduce outgoing radiation to space? There was an increase? That’s weird.

Well, surely to raise surface temperatures, the increase of GHGs in the atmosphere must be emitting more to the surface!

Atmosphere Downwelling Radiation (W/m²):

2001 345.9088
2002 345.9536
2003 345.9887
2004 346.0057
2005 346.1214
2006 346.0831
2007 345.9341
2008 345.7438
2009 345.6955
2010 345.7130
2011 345.6133
2012 345.6087
2013 345.6394
2014 345.6938
2015 345.8217
2016 345.9240
2017 345.9850
2018 345.9551
2019 345.9548
2020 345.8976

Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation

Hmm … pretty much stayed flat. That can’t explain the warming.

What about the sun?

Incoming Solar Radiation (W/m²):

2001 339.3737
2002 339.3658
2003 339.3230
2004 339.2988
2005 339.2740
2006 339.2524
2007 339.2344
2008 339.2214
2009 339.2109
2010 339.2086
2011 339.2136
2012 339.2217
2013 339.2266
2014 339.2317
2015 339.2379
2016 339.2369
2017 339.2331
2018 339.2290
2019 339.2248
2020 339.2246

Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation

Ha ha, solar power decreased. Checkmate stupid deniers.

Hmm, but what could it be? Let me check something else …

Surface Downwelling Solar -
Surface Upwelling Solar =
Net Solar @ the Surface (W/m²)

2001 185.9094 - 23.4757 = 162.4337
2002 185.9978 - 23.4542 = 162.5436
2003 186.0969 - 23.4142 = 162.6827
2004 186.1657 - 23.4145 = 162.7512
2005 186.1106 - 23.3699 = 162.7407
2006 186.1072 - 23.3321 = 162.7751
2007 186.0732 - 23.3219 = 162.7513
2008 186.1493 - 23.3190 = 162.8303
2009 186.1456 - 23.3198 = 162.8258
2010 186.0461 - 23.2820 = 162.7641
2011 186.0331 - 23.2641 = 162.7690
2012 186.0388 - 23.2483 = 162.7905
2013 186.0577 - 23.2634 = 162.7943
2014 186.0882 - 23.2699 = 162.8183
2015 186.1036 - 23.2494 = 162.8542
2016 186.1392 - 23.2089 = 162.9303
2017 186.1627 - 23.1869 = 162.9758
2018 186.2138 - 23.1768 = 163.0370
2019 186.2542 - 23.1582 = 163.0960
2020 186.3039 - 23.1502 = 163.1537

Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation

It looks like despite less solar power, there is more sun actually reaching the surface, and less sun reflected back to space. This extra ~0.72 W/m2 seems to cover all the surface warming completely.
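
To make the argument concrete, here are the 2001-to-2020 end-point changes pulled from the tables above (simple differences, not regression trends):

# End-point (2020 minus 2001) changes, W/m², taken from the CERES tables above
sfc_lw_up   = 399.5614 - 398.9112   # +0.65  surface upwelling longwave
toa_lw_up   = 240.4967 - 240.1311   # +0.37  outgoing longwave to space
sfc_lw_down = 345.8976 - 345.9088   # -0.01  downwelling longwave to surface
solar_in    = 339.2246 - 339.3737   # -0.15  incoming solar at TOA
net_sfc_sw  = 163.1537 - 162.4337   # +0.72  net solar absorbed at the surface
for name, v in [("sfc LW up", sfc_lw_up), ("TOA LW up", toa_lw_up),
                ("sfc LW down", sfc_lw_down), ("solar in", solar_in),
                ("net sfc solar", net_sfc_sw)]:
    print(f"{name:14s} {v:+.2f}")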

I’m starting to see a complete lack of any role of CO2 in warming over the last 20 years.

But that’s OK. I’m a morally superior person, and I will still say that CO2 and humans did it … somehow. No evidence will change my mind because I already know better. It’s just the right thing to do.

— Satire Over —

So, yeah, is any serious person going to argue that humans reduced the clouds and made the surface more absorptive?

The facts don’t support the climate change cult.

Any minor thing humans may have done in the last 2 decades is completely washed out by nature.

The warming was completely induced on the solar shortwave side of the equation.

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2022/06/09
# File: ceres.sh
# Run: . ceres.sh; require; download; show

require() { sudo apt-get install -y nco; }
download() {
    echo go to: https://asdc.larc.nasa.gov/data/CERES/EBAF/Edition4.1/
    echo download the only file there. rename to: ceres.nc
}
one() {
    ncks -HC --trd -v g$1_mon ceres.nc | awk -F '[= ]' '{print $2" "$4
    }' | awk 'NR>10&&$2{print $1-L" "$2-$3; L=$1}' | sed 1d | awk 'NR!=1&&NR%12==1{
        printf "%d %.4f\n", 2001+Y++, S/D;
    } NR%12!=1 { S+=$2*$1; D+=$1; }'
}
pair() {
    one $1_down_$2 > .$1d; one $1_up_$2 > .$1u
    paste .$1d .$1u | awk '{printf "%d %.4f - %.4f = %.4f\n",$1,$2,$4,$2-$4}'
}
show() {
    echo "Solar"; one solar
    echo "TOA"; one toa_lw_all
    echo "SW"; pair sfc_sw all
    echo "LW"; pair sfc_lw all
    echo "Source: https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1/citation"
}

Real Steel Greenhouse Effect

13 years ago, amateur scientist Willis Eschenbach developed a thought experiment that he hoped would very simply illustrate how the Greenhouse Effect works.

The main claim is that the addition of a steel shell surrounding a planetary surface will cause the inner surface to emit TWICE (235 -> 470 W/m2) the radiation compared to not having a steel shell. This should significantly raise the surface temperature from ~254K to ~302K.

Is this true? No.

Willis gives us the freedom to construct any power source with any chosen radius. I will choose a mini nuclear reactor wrapped in a steel housing, with a total surface area of 1 m². The inner radius of the steel housing is 75% of the total radius.

A = No Steel Shell, B = With Steel Shell

Let us go through the equations to set up Willis’ initial scenario (A):

The nuclear power reactor is ONLY capable of making its wall 254.041K – to meet Willis’ initial criteria. It is not capable of anything greater, because nuclear reactions are fixed. No varying levels of downstream radiation will enable nuclear fission reactions to generate more joules.

Now let us see what happens when we add a steel shell (B):

I will give Willis credit for doing a good job of demonstrating the real greenhouse effect:

Outgoing radiation is halved and T2 (our “surface”) has increased from 253.726K to 253.884K, a very feeble gain.

The problem with Willis’ approach is that he doesn’t reduce outgoing radiation and relies on his heat source to crank up … when there is no physical way it can do so.

Summary:

T#    Willis A           Willis B           Reality A          Reality B
T2    253.726K (235W)    301.732K (470W)    253.726K (235W)    253.884K (235.588 W)
T3    n/a                253.726K (235W)    n/a                213.490K (117.393 W)

Summary Table (W = W/m²)

So there you have it. The real steel greenhouse effect managed to raise the surface temperature by 0.062%.
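
As a quick check of the numbers in the table, each flux maps to a blackbody temperature via σT⁴, and the 0.062% figure is just (253.884 − 253.726)/253.726:

SIGMA = 5.670367e-8                      # Stefan-Boltzmann constant, W/m²K⁴

def T(flux_wm2):                         # blackbody temperature for a given emission, K
    return (flux_wm2 / SIGMA) ** 0.25

print(T(235.0), T(470.0), T(235.588))             # -> ~253.73 K, ~301.73 K, ~253.88 K
print((T(235.588) - T(235.0)) / T(235.0) * 100)   # -> ~0.062 %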

Subsequent additions of steel shells will keep raising the surface temperature (T2), asymptotically approaching the nuclear reactor wall temperature (T1).

Enjoy 🙂 -Zoe

Global Fires 2

In a previous post, Trend in Global Fires, I showed the global fire trend over the last 21 years. I found a source with more data, extending back to 1982. It comes from a project funded by the European Space Agency. Right here. The actual data is downloaded from UK servers, here.

Global Area Burned

1994 is missing in their data, but that’s alright. It’s obvious that carbon dioxide has zero effect on fires. Anyone who tells you otherwise is a liar, an imbecile, or just plain ignorant. The latter can be cured.

Short post. Enjoy 🙂 -Zoe

Update

I just realized this data is also plotted at ESA’s site (they sponsored this data):

Source

You see what they did there?

Stretched out the chart so you can barely notice any trend. Biased much?

Download My Data

You can download monthly data at https://pastebin.com/raw/3eVDrgLb

Code

# Zoe Phin, 2022/06/05
# File: firea.sh
# Run: . firea.sh; require; download; index; plot
# Output: fire.csv, firearea.png

require() { sudo apt-get install -y nco gnuplot; }
download() {
    for y in {1982..1993} {1995..2018}; do for m in {01..12}; do
        echo "wget -cO $y$m.nc https://dap.ceda.ac.uk/neodc/esacci/fire/data/burned_area/AVHRR-LTDR/grid/v1.1/$y/$y${m}01-ESACCI-L4_FIRE-BA-AVHRR-LTDR-fv1.1.nc"
    done;done > get.sh; bash get.sh
}
index() {
    for y in {1982..1993} {1995..2018}; do for m in {01..12}; do
        echo -n "$y $m "
        ncks -HC --trd -v burned_area $y$m.nc | awk -F '[= ]' '$8!=0 {
            S+=$8 } END { printf "%.3f\n",S/1e9 }' 
    done;done | tee fire.area
}
plot() { 
    awk '{ Y[$1]+=$3 } END {
        for (y in Y) { printf "%d %.3f\n",y,Y[y]/1000 }
    }' fire.area > fire.csv
    echo "set term png size 740,540; set nokey; 
    set title 'Global Area Burned (Million km²)'
    set xrange [1981:2019]; #set yrange [320:440]
    set mxtics 5
    set grid xtics ytics
    plot 'fire.csv' u 1:2 t 'Million km² Burned' w lines lw 2 lc rgb 'red'
    " | gnuplot > firearea.png 
}

USCRN Cooling

What is USCRN?

One of the principal conclusions of the 1997 Conference on the World Climate Research Programme was that the global capacity to observe the Earth’s climate system is inadequate and deteriorating worldwide and “without action to reverse this decline and develop the GCOS [Global Climate Observing System] , the ability to characterize climate change and variations over the next 25 years will be even less than during the past quarter century” (National Research Council [NRC] 1999). In spite of the United States being a leader in climate research, long term U.S. climate stations have faced challenges with instrument and site changes that impact the continuity of observations over time…

NOAA’s response to the NRC concerns is the USCRN, a network of stations deployed across the continental U.S.

Why CRN is Needed

Where is USCRN data? here.

And what has been the trend at USCRN stations over the last 15 years? (4/2007 – 4/2022)

Air Temperature
$ . crn.sh; crn; plot
Linear Regression Trend: -0.027934 °C/yr

That would be a total cooling of 15 years * -0.027934 °C/year =~ 0.42 °C averaged across all 98 stations that have consistent data going back to 4/2007.

Breakdown by station, in units: °C/year

1                   AK_Fairbanks_11_NE +0.04617
2                        AK_Sitka_1_NE +0.01976
3   AK_Utqiagvik_formerly_Barrow_4_ENE +0.18223
4                      AL_Clanton_2_NE -0.01507
5                   AL_Courtland_2_WSW -0.05665
6                     AL_Cullman_3_ENE -0.03464
7                     AL_Fairhope_3_NE +0.00464
8                      AL_Gadsden_19_N -0.04251
9                  AL_Gainesville_2_NE -0.02576
10                 AL_Greensboro_2_WNW -0.04768
11                  AL_Scottsboro_2_NE -0.01470
12                     AL_Selma_13_WNW -0.03024
13                      AL_Selma_6_SSE -0.03856
14                AL_Valley_Head_1_SSW -0.04865
15                 AR_Batesville_8_WNW -0.05950
16                        AZ_Elgin_5_S -0.01692
17                      AZ_Tucson_11_W -0.01442
18                    CA_Merced_23_WSW +0.03619
19                   CA_Redding_12_WNW +0.03952
20             CA_Stovepipe_Wells_1_SW -0.03849
21                     CO_Boulder_14_W -0.02105
22                      CO_Cortez_8_SE +0.01141
23                     CO_Dinosaur_2_E +0.04307
24                  CO_La_Junta_17_WSW -0.05558
25                  CO_Montrose_11_ENE +0.00016
26                       CO_Nunn_7_NNE -0.06019
27             FL_Everglades_City_5_NE +0.07494
28                   FL_Titusville_7_E +0.07542
29                     GA_Newton_11_SW +0.01076
30                       GA_Newton_8_W -0.01395
31               GA_Watkinsville_5_SSE -0.01863
32                  HI_Mauna_Loa_5_NNE +0.02550
33                       ID_Arco_17_SW -0.04111
34                      ID_Murphy_10_W -0.02431
35                   IL_Champaign_9_SW +0.00057
36                   IL_Shabbona_5_NNE -0.05673
37                  KS_Manhattan_6_SSW -0.10209
38                    KS_Oakley_19_SSW -0.13040
39             KY_Bowling_Green_21_NNE -0.06931
40                 KY_Versailles_3_NNW -0.06288
41                  LA_Lafayette_13_SE -0.01139
42                      LA_Monroe_26_N -0.02333
43                  ME_Limestone_4_NNW -0.10207
44                     ME_Old_Town_2_W -0.05897
45                     MI_Chatham_1_SE -0.10539
46                 MN_Goodridge_12_NNW -0.09950
47               MO_Chillicothe_22_ENE -0.04920
48                     MS_Newton_5_ENE -0.06509
49                   MT_St._Mary_1_SSW -0.06547
50                MT_Wolf_Point_29_ENE -0.00727
51                 MT_Wolf_Point_34_NE -0.08048
52                   NC_Asheville_13_S -0.01759
53                  NC_Asheville_8_SSW -0.01700
54                      NC_Durham_11_W -0.04043
55                       ND_Medora_7_E -0.07155
56                  ND_Northgate_5_ESE -0.15485
57                  NE_Harrison_20_SSE -0.04485
58                    NE_Lincoln_11_SW -0.05038
59                    NE_Lincoln_8_ENE -0.01464
60                    NE_Whitman_5_ENE -0.04729
61                       NH_Durham_2_N -0.05508
62                     NH_Durham_2_SSW -0.04165
63                  NM_Las_Cruces_20_N -0.00039
64                  NM_Los_Alamos_13_W +0.04407
65                     NM_Socorro_20_N -0.03662
66                        NV_Baker_5_W -0.00906
67                    NV_Mercury_3_SSW -0.03112
68                      NY_Ithaca_13_E +0.00221
69                    NY_Millbrook_3_W -0.06797
70                     OK_Goodwell_2_E +0.00749
71                   OK_Stillwater_2_W -0.06988
72                 OK_Stillwater_5_WNW -0.02301
73                       ON_Egbert_1_W -0.07154
74                 OR_Corvallis_10_SSW +0.02164
75                  OR_John_Day_35_WNW -0.01361
76                     OR_Riley_10_WSW -0.00229
77                    RI_Kingston_1_NW -0.03927
78                     RI_Kingston_1_W -0.03514
79                   SC_Blackville_3_W +0.00990
80              SC_McClellanville_7_NE +0.01364
81                   SD_Buffalo_13_ESE -0.03807
82                      SD_Pierre_24_S -0.12013
83               SD_Sioux_Falls_14_NNE -0.06385
84                  TN_Crossville_7_NW -0.04031
85                    TX_Bronte_11_NNE -0.07873
86                  TX_Edinburg_17_NNE +0.00096
87                   TX_Monahans_6_ENE -0.02487
88                    TX_Muleshoe_19_S -0.04374
89                  TX_Palestine_6_WNW -0.07924
90             TX_Panther_Junction_2_N -0.02188
91               VA_Cape_Charles_5_ENE -0.04508
92            VA_Charlottesville_2_SSE -0.03805
93                WA_Darrington_21_NNE +0.04027
94                    WA_Quinault_4_NE +0.05220
95                    WI_Necedah_5_WNW -0.07419
96                    WV_Elkins_21_ENE -0.03204
97                    WY_Lander_11_SSE -0.07578
98                      WY_Moose_1_NNE -0.07829

Only 23 out of 98, that is, ~23.5% show a warming trend.

But there is more!

The nice thing about these stations is that they not only provide air temperature data, but most are also equipped with sensors to measure surface temperature. The results are even more dramatic at the 92 sites with persistent IR surface data:

Surface Temperature
$ . crn.sh; crn; plot
Linear Regression Trend: -0.084043 °C/yr

That would be a total cooling of 15 years * -0.084043 °C/year =~ 1.26 °C averaged across all 92 stations that have consistent surface data going back to 4/2007.

Breakdown by station, in units: °C/year

1                   AK_Fairbanks_11_NE -0.00860
2                        AK_Sitka_1_NE -0.07764
3                     AK_St._Paul_4_NE +0.19834
4   AK_Utqiagvik_formerly_Barrow_4_ENE +0.08741
5                     AL_Fairhope_3_NE -0.07363
6                      AL_Gadsden_19_N +0.03686
7                      AL_Selma_13_WNW +0.09666
8                  AR_Batesville_8_WNW -0.12344
9                         AZ_Elgin_5_S -0.15520
10                      AZ_Tucson_11_W -0.13846
11                    CA_Merced_23_WSW +0.01445
12                   CA_Redding_12_WNW +0.00997
13             CA_Stovepipe_Wells_1_SW -0.15013
14                     CO_Boulder_14_W -0.10934
15                      CO_Cortez_8_SE -0.10363
16                     CO_Dinosaur_2_E -0.09103
17                  CO_La_Junta_17_WSW -0.11102
18                  CO_Montrose_11_ENE -0.16274
19                       CO_Nunn_7_NNE -0.09077
20             FL_Everglades_City_5_NE -0.11284
21                   FL_Titusville_7_E -0.00759
22                     GA_Newton_11_SW -0.00052
23                       GA_Newton_8_W -0.07369
24               GA_Watkinsville_5_SSE -0.12636
25                  HI_Mauna_Loa_5_NNE -0.02257
26                       ID_Arco_17_SW -0.20966
27                      ID_Murphy_10_W -0.16679
28                   IL_Champaign_9_SW +0.03643
29                   IL_Shabbona_5_NNE -0.13466
30                  KS_Manhattan_6_SSW -0.12101
31                    KS_Oakley_19_SSW -0.18244
32             KY_Bowling_Green_21_NNE -0.12606
33                 KY_Versailles_3_NNW -0.06785
34                  LA_Lafayette_13_SE -0.00229
35                      LA_Monroe_26_N -0.13776
36                  ME_Limestone_4_NNW -0.13709
37                     ME_Old_Town_2_W -0.11059
38                     MI_Chatham_1_SE -0.16315
39                 MN_Goodridge_12_NNW -0.10715
40               MO_Chillicothe_22_ENE -0.07844
41                     MS_Newton_5_ENE -0.07688
42                   MT_St._Mary_1_SSW -0.15023
43                MT_Wolf_Point_29_ENE -0.05258
44                 MT_Wolf_Point_34_NE -0.14085
45                   NC_Asheville_13_S -0.12034
46                  NC_Asheville_8_SSW -0.17203
47                      NC_Durham_11_W -0.16045
48                       ND_Medora_7_E -0.05318
49                  ND_Northgate_5_ESE -0.12388
50                  NE_Harrison_20_SSE -0.02705
51                    NE_Lincoln_11_SW -0.09265
52                    NE_Lincoln_8_ENE -0.04629
53                    NE_Whitman_5_ENE -0.10772
54                       NH_Durham_2_N -0.03671
55                     NH_Durham_2_SSW -0.03245
56                  NM_Las_Cruces_20_N -0.06474
57                  NM_Los_Alamos_13_W -0.10742
58                     NM_Socorro_20_N -0.21379
59                        NV_Baker_5_W -0.00542
60                    NV_Mercury_3_SSW -0.06139
61                      NY_Ithaca_13_E +0.00060
62                    NY_Millbrook_3_W -0.09216
63                     OK_Goodwell_2_E -0.06627
64                   OK_Stillwater_2_W -0.15102
65                 OK_Stillwater_5_WNW -0.19385
66                       ON_Egbert_1_W -0.15461
67                 OR_Corvallis_10_SSW +0.02328
68                  OR_John_Day_35_WNW +0.00458
69                     OR_Riley_10_WSW -0.06991
70                     PA_Avondale_2_N -0.08590
71                    RI_Kingston_1_NW -0.03059
72                     RI_Kingston_1_W -0.08983
73                   SC_Blackville_3_W -0.07645
74              SC_McClellanville_7_NE -0.06866
75                   SD_Buffalo_13_ESE -0.10984
76                      SD_Pierre_24_S -0.15606
77               SD_Sioux_Falls_14_NNE -0.11285
78                  TN_Crossville_7_NW -0.11332
79                    TX_Bronte_11_NNE -0.14726
80                  TX_Edinburg_17_NNE -0.03847
81                   TX_Monahans_6_ENE -0.06433
82                    TX_Muleshoe_19_S -0.05690
83                  TX_Palestine_6_WNW -0.00939
84             TX_Panther_Junction_2_N +0.00723
85               VA_Cape_Charles_5_ENE -0.02184
86            VA_Charlottesville_2_SSE -0.07100
87                WA_Darrington_21_NNE -0.08743
88                    WA_Quinault_4_NE -0.19209
89                    WI_Necedah_5_WNW -0.24823
90                    WV_Elkins_21_ENE -0.06492
91                    WY_Lander_11_SSE -0.19794
92                      WY_Moose_1_NNE -0.20405

Only 11 out of 92, that is, ~12% show a warming trend.

Now this is important: I’m not saying that the US has cooled over the last 15 years. All I’m saying is that the newest and best stations suggest this. You decide the significance of this.

Enjoy 🙂 -Zoe

Update 06/07:

If we break up CRN data by years, this is what we get:

$ cat years.csv

0 12.012
1 11.5878
2 11.3553
3 11.889
4 12.9113
5 12.0568
6 11.4294
7 11.9557
8 13.0134
9 13.102
10 12.0355
11 12.2281
12 12.4914
13 12.4444
14 12.541

$ tail -8 years.csv | gmt gmtregress -Fp -o5

-0.00460833333333

Year 0 is 2007/5 to 2008/4 (inclusive), and Year 14 is 2021/5 to 2022/4 (inclusive)

The last 8 years show a definite COOLING trend for air temperature.

$ cat years.csv

0 12.564
1 12.0218
2 11.8927
3 12.6139
4 13.5311
5 12.7203
6 11.7274
7 12.2206
8 13.1326
9 13.2569
10 12.3038
11 12.3714
12 12.4584
13 12.7154
14 12.6693

$ tail -12 years.csv | gmt gmtregress -Fp -o5

-0.0173821678322

The last 12 years show a definite COOLING trend for surface temperature.
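
The same slope can be reproduced without GMT; gmtregress with no options performs an ordinary least-squares fit, which numpy matches (a sketch, assuming years.csv holds the surface-temperature listing above):

# Reproduce the gmtregress slope with ordinary least squares (illustrative check).
import numpy as np

data = np.loadtxt("years.csv")                     # columns: year index, mean temperature (°C)
last12 = data[-12:]                                # same rows as `tail -12 years.csv`
slope = np.polyfit(last12[:, 0], last12[:, 1], 1)[0]
print(slope)                                       # -> ~ -0.0174 °C/yr, matching the output above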

Code:

# Zoe Phin, 2022/06/04
# File: crn.sh
# Run: . crn.sh; require; download; crn; plot; sta

require() { sudo apt-get install -y gmt gnuplot; }
download() {
    wget -O crn.zip https://www.ncei.noaa.gov/pub/data/uscrn/products/monthly01/snapshots/CRNM0102202205300730.zip
    unzip crn.zip
}
crn() {
    for sta in $(ls -1 CRN*); do
        [[ $(egrep '200704|202204' $sta | awk '{print $9}' | grep -v 9999 | wc -l) -lt 2 ]] && continue
        tmp=${sta##CRNM0102-}; echo -n "${tmp%%.txt} "
        awk '$2>=200704 && $2<=202204 { printf "%.2f ",$9 }' $sta 
        echo
    done | awk '{ 
        for (m=2;m<=NF;m++) { if ($m!="-9999.00") { T[m-2]+=$m; N[m-2]+=1 } }
    } END { 
        for (m in T) print 2007+4/12+m/12-1/24" "T[m]/N[m] 
    }' > crn.csv
}
plot() { 
    cat crn.csv | gmt gmtregress > plot.mo
    echo "set term png size 740,470
        set key out top horizontal
        set yrange[-4:28]; set xrange[2007:2022.8]
        set ylabel 'Surface Temperature (°C)'
        set xtics 5; set mxtics 5; set mytics 5
        set grid xtics mxtics ytics
        plot 'plot.mo' u 1:2 w lines t 'Monthly' lw 1 lc rgb 'black',\\
             'plot.mo' u 1:3 w lines t 'Trend'   lw 2 lc rgb 'blue'
    " | gnuplot > crn.png
    echo -n "Linear Regression Trend: "
    cat crn.csv | gmt gmtregress -Fp -o5 | awk '{printf "%.6f °C/yr\n",$1}'
}
sta() {
    let n=0
    for sta in $(ls -1 CRN*); do
        [[ $(egrep '200704|202204' $sta | awk '{print $9}' | grep -v 9999 | wc -l) -lt 2 ]] && continue
        tmp=${sta##CRNM0102-}; echo -n "$((++n)) ${tmp%%.txt} "
        awk '$2>=200704 && $2<=202204 && $9!~/9999/ { y=substr($2,1,4); m=substr($2,5,2);
            printf "%4.2f %4s\n",y+m/12-1/24,$9 
        }' $sta | gmt gmtregress -Fp -o5
    done | awk '{
        printf "%-3d %34s %+8.5f\n", $1, $2, $3
    }' | tee sta.csv
    echo -n "Avg: "
    cat sta.csv | awk '{S+=$3}END{print S/NR}'	
}

How to Underestimate Geothermal

Many people believe that, because the geothermal heat flux is ~91 milliwatts/m², the Earth’s surface would be ~36 Kelvin without the sun. This is calculated the following way:

σ·T⁴ = 0.091 W/m²  →  T = (0.091 / 5.670367e-8)^0.25 ≈ 35.65 K

The math is correct, but there are two problems. First, this is for the top of the atmosphere, not for the surface. But I will not address this today. The second problem, which I will address today, is that this flux is wrong for this type of calculation.

Imagine an internally heated object is radiating to space. It’s very warm on the inside, but it very rapidly cools as you approach the exterior surface. Then a much hotter object joins in and starts to heat the exterior surface. What happens? The surface is now much warmer than it would otherwise be, and the flux going from inside to outside is … reduced!

The famous [small] geothermal heat flux is already a product of this reduction! The question now becomes … what would the flux be without the sun?

This question is extremely difficult to answer given all the variable parameters, and as far as I can tell … NO ONE has attempted it. But I’ll give it a small shot 🙂

I made a very simplistic 1-dimensional heat transfer program (code below) to help me out. First let me tell you its shortcomings:

No Albedo. No emissivity. No seasons. One latitude. No accounting of variation of specific heat or k-value as a function of depth. m*Cp = 222 and k=1, for all depths.

The sun simulation: Sun rises at 6am and goes to a full 1360 W/m² (no albedo) at high noon and disappears again at 6pm (like equator on equinox). This is repeated for 10 years (3650 days).

Despite these limitations, my central point is correct and will become obvious.

Not knowing where to start, I chose 273.15 K (0 °C) as the inherent geothermal temperature at 200m depth. The result becomes:

T @ depth of 200m = 273.15 K
$ cat plot.csv

  0 253.63
  2 253.82
  4 254.02 ...
196 272.76
198 272.95
200 273.15

The geothermal flux here is:

Geothermal Heat Flux if T @ depth 200m = 273.15K
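
A rough cross-check: with k = 1 W/(m·K), as assumed in the simulation, the near-surface conductive flux implied by the top of the plot.csv profile is k·ΔT/Δz:

k  = 1.0                     # W/(m·K), the conductivity assumed in the simulation
dT = 253.82 - 253.63         # K, temperature difference between 2 m and 0 m depth
dz = 2.0                     # m
print(k * dT / dz)           # -> ~0.095 W/m², close to the measured ~0.091 W/m²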

That is very close to the measured geothermal flux. Now let’s see what happens when we turn off the solar influence:

No Sun

This result can also be achieved via a simple equation:

The geothermal heat flux now is 0.9462 W/m². That is … nearly 10 TIMES the flux when the sun is included. 9.695x to be more exact.

So the actual geothermal flux without the sun is very different than the flux with the sun. And this is not surprising.

Before my critics complain that I arbitrarily kept the temperature at 200m depth constant, they should be reminded that this doesn’t matter to my central point. The top (surface) cools much faster than the interior, and this immediately raises the flux. Either way, the famous geothermal heat flux is irrelevant for no-sun calculations, as the REAL “steady state” tendency [without the sun] is to cool from top to bottom to the external temperature set by distant space objects (2.725K). Given infinite time … the heat flux goes to ZERO, and the surface temperature is obviously driven much colder than the 35.65 K figure calculated simply from today’s current geothermal heat flux with solar influence.

The 91 milliWatts/m² geothermal heat flux is worthless for no-sun predictions of surface temperature. QED

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2022/05/06
# File: hsim.sh
# Run: source hsim.sh; heat; plot

heat() {
    awk 'func show(s,t,T,x) {
            if (x=="AVG") printf "%*s AVG   ",12," "
            else printf "d%03d %02d:%02d |%04dW| ",t/1440+1,t%1440/60,t%60,s
            for (i=0;i<=L/2;i++) { printf "%06.2f ",T[i]; A[i]+=T[i]/24 }; print ""
        }
        func reset(s,t,T) {
            show(s,t-1440,A,"AVG")
            for (i=0;i<=L/2;i++) { A[i]=0 }
        } 
        BEGIN { C=222; d=3650; L=200 
        for (t=0;t<d*1440;t++) {
            T[L/2] = 273.15	# Geothermal Base Temperature
            if ((S=1360*cos(atan2(0,-1)*(t-720)/720))<0) S=0
            if (t/1440>0&&t%1440==0) reset(S,0,A)
            if (t%60==0) show(S,t,T,"")	
            for (i=0;i<L/2;i++) {
                D = (T[i]-T[i+1])/2
                T[i]-=D/C; T[i+1]+=D/C 
            }
            T[0] += (S-5.67e-8*T[0]^4)/C
        }
        reset(S,0,A)
    }' > data
}

plot() {
    tail -1 data | tr -s ' ' '\n' | sed 1,2d | awk '{print (NR-1)*2" "$1}' > plot.csv
    echo "set term png size 740,540; set nokey
        set title 'Depth (m) vs Temperature(K)'
        set yrange [200:0]
        set xrange [240:290]
        set ytics 20; set mytics 4
        set xtics 5; set mxtics 5
        set grid ytics xtics
        plot 'plot.csv' u 2:1 w lines lw 2 lc rgb 'red'
    " | gnuplot > plot.png
}

Ivermectin works

About a week ago I read a whole slew of mainstream media articles claiming that a new study completely debunks Ivermectin’s use in treating COVID19.

This is completely false.

Here’s a great takedown of that propaganda:

Whoops! The TOGETHER Trial actually showed that ivermectin worked.

And here’s a meta analysis of all the studies:

IVMMeta

We’re living in a mad mad world, and the demons are not being caged.

– Z

Global Sea Ice Area 2

I found two more sources of sea ice concentration data. One is the Hadley Centre, in the UK (Description and Data), and the other is NOAA (Description and Data [Used Monthly]).

Here’s what they look like:

Linear Regression Trend: From 0.03789 To 0.03738 is -1.336%
Linear Regression Trend: From 0.03891 To 0.04328 is +11.239%

What to make of such a large difference? I’m not sure. I’ll have to skip this question for now.

Some may say that the Hadley data contradicts my last post, but I’m not too concerned about this because we all know carbon dioxide is not causing sea ice to melt. How do we know this?

Well here’s the same Hadley data with the last 3 years removed:

Linear Regression Trend: From 0.03752 To 0.03815 is +1.664%

Will any climate alarmist seriously argue that carbon dioxide can only melt ice in the last few years but not from 1980 to 2017?

That’s all. Enjoy 🙂 -Zoe

Code hice.sh

# Zoe Phin, 2021/12/04
# File: hice.sh
# Run: . hice.sh; require; download; index; plot
require() { sudo apt-get install -y gmt gnuplot python3-matplotlib python3-xarray; }
download() { wget -O- -c 'https://www.metoffice.gov.uk/hadobs/hadisst/data/HadISST_ice.nc.gz' | zcat > hice.nc; }
index() { echo "import xarray as x; import matplotlib.pyplot as p; import numpy as u  # numpy ufuncs work on DataArrays (xarray.ufuncs is deprecated)
    ice = x.open_dataset('hice.nc').sic
    for m in (ice*u.cos(u.deg2rad(ice.latitude))).mean(['latitude','longitude']).values:
        print(m)" | sed 's/^\t//' | python | awk '{printf "%.2f %7.4f\n", 1870+NR/12-1/24, $1}' > hice.csv
}
plot() { 
    awk '$1>=1980 && $1<=2017 {print}' hice.csv | gmt gmtregress | awk 'NR>1{print $1" "$2" "$3}' > hice.dat
    sed -n '1p;$p' hice.dat | awk 'NR==1{S=$3} NR==2{E=$3} END { printf "\
    Trend: From %.5f To %.5f is %+.3f%\n", S, E, (E/S-1)*100 }'
    echo "set term png size 740,470; unset key; set title 'Global Sea Ice (Hadley)'
    set grid ytics xtics; set mxtics 5; set mytics 5; set ytics format '%.3f'
    set xrange [1979:2021]; plot 'hice.dat' u 1:2 w l lw 1 lc rgb '#0000EE',\
         'hice.dat' u 1:3 w l lw 2 lc rgb '#000088'" | gnuplot > hice.png
}

Code icec.sh

# Zoe Phin, 2021/12/04
# File: icec.sh
# Run: . icec.sh; require; download; index; plot
require() { sudo apt-get install -y gmt gnuplot nco; }
download() { wget -O icec.nc -c 'ftp://ftp2.psl.noaa.gov/Datasets/noaa.oisst.v2/icec.mnmean.nc'; }
index() {
    for t in {0..479}; do
        ncks -HC --trd -d time,$t,$t -v icec icec.nc | awk -vt=$t -F '[= ]' '{ 
        if ($8 == 32767) V=0; else V=$8/100+327.65
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=(a*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=V*A; } END { printf "%.2f %.5f\n", 1982-1/24+t/12, S/SA/100 }'
    done | tee icec.csv
}
plot() { 
    awk '$1>=1982 && $1<=2020 {print}' icec.csv | gmt gmtregress | awk 'NR>1{print $1" "$2" "$3}' > icec.dat
    sed -n '1p;$p' icec.dat | awk 'NR==1{S=$3} NR==2{E=$3} END { printf "\
    Trend: From %.5f To %.5f is %+.3f%\n", S, E, (E/S-1)*100 }'
    echo "set term png size 740,470; unset key; set title 'Global Sea Ice (NOAA)'
    set grid ytics xtics; set mxtics 5; set mytics 5; set ytics format '%.3f'
    set xrange [1979:2021]; plot 'icec.dat' u 1:2 w l lw 1 lc rgb '#0000EE',\
         'icec.dat' u 1:3 w l lw 2 lc rgb '#000088'" | gnuplot > icec.png
}

Global Sea Ice Area

According to many sources (including National Snow and Ice Data Center), global sea ice has been drastically decreasing for a long time. Today I will show you a very legitimate source that will have you question this fact. The data comes from NASA, specifically here, or here. The relevant data variable is called FRSEAICE.

I analyzed monthly data over exactly 39 years, from 1982/10 to 2021/09. Here is my result:

Linear Regression Trend: From 0.03625 To 0.03631 is +0.171%

The “Sea Ice Area Fraction” is a proportion of the entire Earth’s surface that is ice over water. As you can see, about 3.6% (on average) of our planet’s area is covered in ice over water. In the last 39 years, ice over water has INCREASED, and not decreased, as popularly claimed.

The observed increase of 0.00006 is equivalent to ~30,600 km², roughly the size of Belgium.
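
Checking that equivalence (a sketch, using Earth's surface area of ~510,072,158 km²):

frac_increase = 0.00006             # increase in global sea-ice area fraction
earth_area    = 510072158           # km², Earth's surface area
print(frac_increase * earth_area)   # -> ~30,604 km², roughly the area of Belgium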

Now let’s break it down by hemisphere:

North Hemisphere
Linear Regression Trend: From 0.04039 To 0.03405 is -15.707%
South Hemisphere
Linear Regression Trend: From 0.03214 To 0.03855 is +19.947%

The large loss of sea ice in the northern hemisphere is more than made up for by a larger gain of sea ice in the southern hemisphere.

That’s all. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/11/27
# File: ice.sh
# Run: . ice.sh; require; download; ice; plot
# Data: https://goldsmr4.gesdisc.eosdis.nasa.gov/opendap/MERRA2_MONTHLY/M2TMNXFLX.5.12.4/contents.html

require() { sudo apt-get install -y gmt gnuplot; }
download() { user=username; pass=password
    let n=1; for y in {1982..2021}; do for m in {01..12}; do
        [ $y -eq 1992 ] && n=2; [ $y -eq 2001 ] && n=3; [ $y -eq 2011 ] && n=4;  
        url="https://goldsmr4.gesdisc.eosdis.nasa.gov/opendap/MERRA2_MONTHLY/M2TMNXFLX.5.12.4/$y/MERRA2_${n}00.tavgM_2d_flx_Nx.$y$m.nc4.nc"
        wget -O M$y$m.nc --user=$user --password=$pass -c $url
    done;done
}
one() {
    ncks -HC --trd -v $1 M$2$3.nc | awk -F '[= ]' -vy=$2 -vm=$3 '{ 
        n=2; if ($4 == "-90" || $4 == "90") { n=4; $4=89.875 }
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=1.25*(a/n*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=$8*A 
    } END { printf "%.2f %.5f\n", y+m/12-1/24, S/SA }'
}
ice() { for y in {1982..2021}; do for m in {01..12}; do one FRSEAICE $y $m; done; done | tee ice.csv; }
plot() { 
    cat ice.csv | awk '$1>1982.8 {print}' | gmt gmtregress | awk 'NR>1{print $1" "$2" "$3}' > ice.dat

    sed -n '1p;$p' ice.dat | awk 'NR==1{S=$3} NR==2{E=$3} END { printf "\
    Trend: From %.5f To %.5f is %+.3f%\n", S, E, (E/S-1)*100 }'

    echo "set term png size 740,470; unset key; set title 'Global Sea Ice Area Fraction'
    set grid ytics xtics; set mxtics 5; set mytics 5; set ytics format '%.3f'
    set xrange [1982.5:2022.5]; 
    plot 'ice.dat' u 1:2 w l lw 1 lc rgb '#0000EE',\
         'ice.dat' u 1:3 w l lw 2 lc rgb '#000088'" | gnuplot > ice.png
}

Note: This data requires user registration here. Replace username and password in the code.

Real Global Snowfall Trend

In a previous post, I tried measuring the global snowfall trend over the last 41 years using a pixel color technique because I couldn’t find the original data behind NASA’s public images. I have now found the higher resolution data needed to find the most accurate global snowfall trend.

The data is from here1. The results are very similar to the previous result, so the pixel color technique held up very well.

Here is the real global snowfall trend:

Linear Regression Trend: From 2.855 To 2.949 is +3.292%

Global snowfall has increased by over 3 percent in the last four decades.

By hemisphere:

North Hemisphere
Linear Regression Trend: From 2.771 To 2.538 is -8.415%
South Hemisphere
Linear Regression Trend: From 2.939 To 3.358 is +14.232%

That’s all. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/11/25
# File: snow.sh
# Run: . snow.sh; require; download; snow; plot
# Data: https://goldsmr4.gesdisc.eosdis.nasa.gov/opendap/MERRA2_MONTHLY/M2TMNXFLX.5.12.4/contents.html

require() { sudo apt-get install -y gmt gnuplot; }
download() { user=username; pass=password
    let n=1; for y in {1980..2020}; do for m in {01..12}; do
        [ $y -eq 1992 ] && n=2; [ $y -eq 2001 ] && n=3; [ $y -eq 2011 ] && n=4;  
        url="https://goldsmr4.gesdisc.eosdis.nasa.gov/opendap/MERRA2_MONTHLY/M2TMNXFLX.5.12.4/$y/MERRA2_${n}00.tavgM_2d_flx_Nx.$y$m.nc4.nc"
        wget -O M$y$m.nc --user=$user --password=$pass -c $url
    done;done
}
one() {
    ncks -HC --trd -v $1 M$2$3.nc | awk -F '[= ]' -vy=$2 -vm=$3 '{ 
        n=2; if ($4 == "-90" || $4 == "90") { n=4; $4=89.875 }
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=1.25*(a/n*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=$8*A 
    } END { printf "%.3f %.3f\n", y+m/12-1/24, 1e6*S/SA }'
}
snow() { for y in {1980..2020}; do for m in {01..12}; do one PRECSNO $y $m; done; done | tee snow.csv; }
plot() { 
    cat snow.csv | gmt gmtregress | awk 'NR>1{print $1" "$2" "$3}' > snow.dat

    sed -n '1p;$p' snow.dat | awk 'NR==1{S=$3} NR==2{E=$3} END { printf "\
    Trend: From %.3f To %.3f is %+.3f%%\n", S, E, (E/S-1)*100 }'

    echo "set term png size 740,470; unset key; set title 'Global Snowfall (mg/m^2/s)'
    set grid ytics xtics; set mxtics 5; set mytics 5; set ytics format '%.1f'
    set xrange [1979.5:2021.5]
    plot 'snow.dat' u 1:2 w l lw 1 lc rgb '#0000EE',\
         'snow.dat' u 1:3 w l lw 2 lc rgb '#000088'" | gnuplot > snow.png
}

Note 1: This data requires user registration here. Replace username and password in the code.

YouTube removed 3.9 million dislikes for White House

81m.org has been keeping track of White House videos since early February. Today I reveal the running count of dislikes removed by YouTube since the inception of 81m.org.

$ . 81m.sh; removals

Total: -3901479

3.9 Million dislikes removed!

The entire list of removals by video is archived here.

The official totals are:

$ . 81m.sh; totals

Views: 26645997
Likes: 378138
Dislikes: 2048757

Nearly twice as many dislikes were removed as remain. Disgusting!

Here’s another interesting fact:

81m.org tracked 655 videos released by the White House. Only 427 videos are listed on White House’s YouTube Channel. They removed 228 videos!

So odd for the most popular resident ever!

Take care, -Zoe

BTW, this does not include the 130,000 dislikes removed Jan 26 to 27, as discovered here.

Code

# Zoe Phin, 2021/06/21
# File: 81m.sh
# Run: source 81m.sh; all
# Output: removed.txt

vidlist() {
    u="https://81m.org"
    last=$(wget -qO- $u | egrep -o '/[0-9]+' | tail -1 | tr -cd 0123456789)
    for i in $(seq $last); do
        wget -qO- "$u/$i/" | awk -F\' '
            $2~/os\//{print "https://81m.org"$2"/data.tsv"}'
    done | tee vidlist.csv
}

archive() {
    mkdir -p data
    for i in $(cat vidlist.csv); do
        wget -O data/${i:23:11}.csv $i
    done
}

removals() {
    for f in $(ls data); do
        awk -vv=${f:0:11} '{D=$3-d;if(D<0)S+=D;d=$3}END{print v" "S}' data/$f
    done | awk '{print "https://www.youtube.com/watch?v="$1" "$2; S+=$2}END{
        print "\nTotal: "S}' | tee removed.txt
}

totals() {
    for f in $(ls data); do
        tail -1 data/$f
        echo
    done | awk '{V+=$1;L+=$2;D+=$3} END {print "Views: "V"\nLikes: "L"\nDislikes: "D}'
}

all() { vidlist; archive; totals; removals; }

Please use this code only for research, and not to spam my anonymous friend who made that site. Thank you.

Global Hurricane Hours

Results Preview

Climate alarmists claim that hurricane frequency is increasing. I have already dealt with this for the Atlantic, here. There was no trend. Today, I will analyze global data.

NOAA sponsors the largest collection of historic hurricane data: IBTrACS (International Best Track Archive for Climate Stewardship). There are 13,545 storms archived going back to 1842, from a total of 14 agencies. My focus will only be on hurricanes, that is: storms that at some time achieved wind speeds at or above 64 knots, also known as Category 1 or greater.

It is definitely true that the number of detected hurricanes has increased. This is due to better sensing technology: radiosondes (1930s), regular transoceanic air travel (1940s), and satellites (late 1960s). Due to the limitations of older data, it doesn’t make any sense to consider global data before 1950.

Aside from detection, there is also a matter of how one counts the frequency of hurricanes. Does it make sense to count a 6-hour Category 3 storm the same as a 42-hour Category 3 storm?

Both classified as Category 3, but Cat3 status unequal in time.

No, it doesn’t make sense. Such a thing would be misleading. But that is what climate alarmists do.

A better approach is to count the hours spent in each wind speed category:
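
Here is a minimal sketch of that counting, using the same knot thresholds as the code at the end of this post and assuming 3-hourly best-track records (each matching record contributes 3 hours), ignoring the de-duplication across agencies that the real script performs. Input lines are "year wind_knots":

awk '{
    if      ($2 >= 137) c = 5
    else if ($2 >= 113) c = 4
    else if ($2 >=  96) c = 3
    else if ($2 >=  83) c = 2
    else if ($2 >=  64) c = 1
    else next                    # below hurricane strength
    H[$1" cat"c] += 3            # each 3-hourly record adds 3 hours to that year/category
} END { for (k in H) print k, H[k] }'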

And this is exactly what I did. Here are the results:

10yr CMA means 10-Year Centered Moving Average.

What does the best hurricane data in the world show?

Category 5 has decreased

Category 4 is cyclic/no-trend

Category 3 had increased, but dropping last 25 years

Category 2 had increased, but dropping last 25 years

Category 1 is cyclic/no-trend

Now we go to category hybrids:

Category 1 & 2
Categories 3,4 & 5

Yeah, pretty much cyclic/no trend.

Category 1,2,3,4,5 is the first image in this post. No trend!

Here are stats for top 10 years for each of our categories:

Cat 1 Hours
1972 :  5187
1992 :  4533
1997 :  4230
1996 :  4149
1990 :  4059
1971 :  4047
1964 :  3924
1968 :  3711
1989 :  3564
1986 :  3402
Cat 2 Hours
1992 :  2670
1997 :  2289
2015 :  2184
2004 :  2082
1972 :  2022
2005 :  1863
1994 :  1833
1996 :  1818
2018 :  1800
1991 :  1743
Cat 3 Hours
2015 :  1752
1997 :  1692
2018 :  1686
1992 :  1650
2004 :  1569
2005 :  1434
1972 :  1416
2019 :  1359
1994 :  1344
1991 :  1311
Cat 4 Hours
2004 :  1194
2015 :  1122
2018 :  1086
1997 :  1032
1992 :  1014
2005 :   978
1996 :   966
1994 :   909
2003 :   894
2014 :   885
Cat 5 Hours
1959 :   528
1997 :   462
1958 :   408
1957 :   330
2018 :   324
1961 :   312
1965 :   291
1954 :   267
2004 :   234
1962 :   228
Cat 1+2 Hours
1972 :  6201
1992 :  5739
1997 :  5412
1996 :  5031
1990 :  4842
1971 :  4767
1964 :  4599
2015 :  4542
1968 :  4359
1994 :  4320
Cat 3+4+5 Hours
1997 :  2466
2004 :  2379
1992 :  2295
2015 :  2268
2018 :  2187
2005 :  2007
1994 :  1866
1961 :  1842
1996 :  1797
2003 :  1710
Cat 1+2+3+4+5 Hours
1972 :  6606
1992 :  6489
1997 :  6123
2015 :  5799
1990 :  5613
1996 :  5544
1971 :  5424
1964 :  5373
2018 :  5346
2004 :  5307

Climate alarmists claim that greenhouse gases create more energy for hurricanes. Well, where is that extra energy for hurricanes?

Take care, -Zoe

Update

I added more categories.

Storm hours over 20 knots (23mph, 37 km/h)

There is a decrease in hours spent above 20 knots in the satellite era.

Storm hours over 5 knots (5.8 mph, 9.3 km/h)

Also a decrease in hours spent above 5 knots. This is not even a breeze.

So, I ask again: where is the extra storm energy from carbon dioxide?

Code

# Zoe Phin, 2021/06/16
# File: ibtracs.sh
# Run: source ibtracs.sh; require; download; hurrs; stats; plotall

require() { sudo apt-get install nco gnuplot; }

download() { wget -c "https://www.ncei.noaa.gov/data/international-best-track-archive-for-climate-stewardship-ibtracs/v04r00/access/netcdf/IBTrACS.ALL.v04r00.nc"; }

view="ncks IBTrACS.ALL.v04r00.nc --trd -HC -v" 

filter() {	# Speed Filter: $1 - Min, $2 - Max
    tr -d ' ' | tr _ 0 | awk -vm=$1 -vM=$2 -F '[]=[]' '$8>=m && $8<=M {print $2"."$4}'
}

winds() { # All data sources
    $view bom_wind        | filter $1 $2 > .w01
    $view cma_wind        | filter $1 $2 > .w02
    $view ds824_wind      | filter $1 $2 > .w03
    $view hko_wind        | filter $1 $2 > .w04
    $view mlc_wind        | filter $1 $2 > .w05
    $view nadi_wind       | filter $1 $2 > .w06
    $view neumann_wind    | filter $1 $2 > .w07
    $view newdelhi_wind   | filter $1 $2 > .w08
    $view reunion_wind    | filter $1 $2 > .w09
    $view td9635_wind     | filter $1 $2 > .w10
    $view tokyo_wind      | filter $1 $2 > .w11
    $view usa_wind        | filter $1 $2 > .w12
    $view wellington_wind | filter $1 $2 > .w13
    $view wmo_wind		  | filter $1 $2 > .w14
}

yhours() { 
    winds $1 $2
    sort -nu .w* | awk -F. '{print $1}' | uniq -c | awk '{print $2" "$1}' > storm.obs
    sed -f storm.year storm.obs | awk '{Y[$1]+=$2} END { for (y in Y) print y" "Y[y]*3}'
}

hurrs() {
    $view season | tr -d ' ' | awk -F '[]=[]' '{print "s/^"$4" / "$6" /"}' | sed \$d > storm.year
    yhours 137 999 > cat5.hours
    yhours 113 136 > cat4.hours
    yhours  96 112 > cat3.hours
    yhours  83  95 > cat2.hours
    yhours  64  82 > cat1.hours

    yhours  64  96 > cat12.hours
    yhours  96 999 > cat345.hours
    yhours  64 999 > cat12345.hours
}

stats() {
    for c in 1 2 3 4 5 12 345 12345 ; do
        f="cat$c.hours"
        sort -rnk2.1 $f | awk -vc=$c 'BEGIN{ 
            print "Cat "c" Hours" } 
    NR<11 { printf "%s : %5s\n",$1,$2 }' 
    done
}

cma() {
    cut -c6- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++) s+=$j/(p+1)
        printf "%.4f\n", s
    }}'
}

plot() {
    awk '$1>=1950{print $0}' $1 | tee .dat | cma 10 > .cma
    paste -d ' ' .dat .cma > .plt
    c=$(echo -n $1 | tr -cd 12345)
    echo "set term png size 740,480; set mytics 5
    set key outside top left horizontal
    set grid; set xrange [1950:2020]
    plot '.plt' u 1:2 t 'Category $c Hours' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > c$c.png
}

plotall() { for c in 1 2 3 4 5 12 345 12345 ; do plot "cat$c.hours"; done; }

On Albedo

Albedo is a measure of how much incoming radiation is reflected. It’s vital in climate science, so it’s important to know what it actually is and how it has been trending.

I was taught in school that albedo is 0.3. NASA’s Earth Factsheet lists it as 0.306. I have always used 0.3 for quick calculations. Most climate science guides on the internet also use this value.

The history of albedo-finding is shown in this paper:

History of Albedo

The history generally shows a reduction in albedo. The real question is: is albedo actually decreasing, or are our measurement techniques simply improving?

This matters a great deal, because:

A drop of as little as 0.01 in Earth’s albedo would have a major warming influence on climate—roughly equal to the effect of doubling the amount of carbon dioxide in the atmosphere, which would cause Earth to retain an additional 3.4 watts of energy for every square meter of surface area.

NASA

The radiative forcing formula to make the above quote true would have to be:

Forcing = 4.906 * ln(new_co2 / old_co2)   { W/m² }

Because 4.906*ln(2) = 3.4. IPCC uses the value of 5.35 rather than 4.906, but I have to go with NASA here.

co2levels.org reminds us that CO2 concentration has increased from ~370ppm in 2000 to ~420ppm today. The theoretical forcing since 2000 would be:

4.906*ln(420/370) = 0.622 W/m²

Back to albedo …

The best available albedo data from 2000 to 2021 comes from CERES (“the only project worldwide whose prime objective is to produce global climate data records of Earth’s Radiation Budget”). You can download it here, after you register here. Here is what the data shows:

Albedo Change 2000 - Now:
0.2929 - 0.2891 = 0.0038

The theoretical albedo forcing would thus be

3.4 * 0.0038/0.01 = 1.292 W/m²
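
Both numbers are easy to reproduce from the figures above:

$ awk 'BEGIN {
    printf "CO2 forcing since 2000:   %.3f W/m2\n", 4.906*log(420/370)
    printf "Albedo forcing 2000-2021: %.3f W/m2\n", 3.4*0.0038/0.01
  }'

CO2 forcing since 2000:   0.622 W/m2
Albedo forcing 2000-2021: 1.292 W/m2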

Thus the albedo forcing is twice as high as the CO2 forcing!

Precisely because it’s twice as high, the IPCC and other climate change advocacy groups do not use albedo at all. They refer to surface albedo, which favors slight cooling, but not to atmospheric albedo, which would ruin their neat narrative.

IPCC, AR5 Figure 8.17

The atmospheric albedo forcing from 2000 to 2021 is ~77% of the entire theoretical CO2 forcing from 1750 to just before that report was issued in 2013.

Think about it. Take care. -Zoe

Code

# Zoe Phin, 2021/06/01
# File: alb.sh
# Run: . alb.sh; require; download; plot

require() { sudo apt-get install -y nco gmt gnuplot; }

download() { echo "	No automated download. Follow instructions:
    Register Account at: https://urs.earthdata.nasa.gov/
    Manually Download with a web browser:
    https://asdc.larc.nasa.gov/data/CERES/EBAF/TOA_Edition4.1/CERES_EBAF-TOA_Edition4.1_200003-202103.nc
    Then move file to this directory."
}

plot() {
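    # Monthly albedo = global-mean reflected shortwave at TOA (gtoa_sw_all_mon) divided by
    # incoming solar (gsolar_mon); the series is then fit with gmt gmtregress and plotted.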
    cmd="ncks CERES_EBAF-TOA_Edition4.1_200003-202103.nc"
    $cmd -v gtoa_sw_all_mon --trd -HC | sed \$d | awk -F= '{print $3}' > .swa
    $cmd -v gsolar_mon --trd -HC | sed \$d | awk -F= '{print $3}' > .sun
    paste -d ' ' .swa .sun | awk '{	printf "%.3f %.4f\n", 
        2000+1/6+NR/12+1/24, $1/$2 }' | gmt gmtregress | sed 1d | tee .plt | sed -n '1p;$p' | awk '{
            printf "%.4f ", $3} END {print ""} ' | awk '{print $1" - "$2" = "$1-$2}'
    echo "set term png size 740,560
    set key out top center horizontal
    set xrange [1999.5:2021.5]
    set format y '%.03f'
    set mxtics 5; set mytics 5
    set grid 
    plot '.plt' u 1:2 t 'Albedo' w l lw 2 lc rgb '#0000EE',\
             '' u 1:3 t 'Linear Regression' w l lw 2 lc rgb '#000077'
    " | gnuplot > alb.png
}

Atlantic Hurricanes Trend

Climate alarmists claim that Atlantic hurricanes will increase in frequency and intensity due to emission of carbon dioxide. Is this true?

NOAA provides the data (HURDAT2) we need to examine this claim. Let’s first look at the frequency of hurricanes:

Hurricane Occurrences per Year

Their first claim has some evidence, but let’s give this some thought: is measuring the frequency really sensible? Wouldn’t it make more sense to measure the amount of time the Atlantic spends in hurricane mode? Yes, I think that is a better measure.

Hours of Hurricanes per Year

The amount of hours of hurricanes per year shows absolutely no trend!

What about their second claim: Is intensity increasing?

We can figure out hurricane intensity using a hurricane’s lowest pressure as a proxy. The lower the pressure the more intense the storm.

Here are all the hurricanes and their lowest pressure values:

Hurricane #’s Lowest Pressure

There is absolutely no trend in hurricane intensity in nearly 170 years!

Clearly, climate alarmists are wrong in regard to Atlantic hurricanes.

That’s all. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/05/20
# File: atlhur.sh
# Run: source atlhur.sh; require; download; freqplot; hoursplot; presplot
# Output: freq.png, hours.png, pres.png

require() { sudo apt-get install -y gnuplot; }

download() { wget -cO atl.csv "https://www.nhc.noaa.gov/data/hurdat/hurdat2-1851-2019-052520.txt"; }

cma() {
    cut -c6- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++) s+=$j/(p+1)
        printf "%.4f\n", s
    }}'
}

freqplot() {
    cat atl.csv | tr -d '\n' | sed 's/AL[0-9]/\nAL/g' | grep HU | awk '{
        print substr($1,4,4)}' | uniq -c | awk '{ 
            print $2" "$1 }' | tee freq.csv | cma 10 > freq.cma

    paste -d ' ' freq.csv freq.cma > freq.plt
    
    echo "set term png size 740,480; set mytics 2
    set key outside top center horizontal
    set grid; set xrange [1850:2020]
    plot 'freq.plt' u 1:2 t 'Hurricanes' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > freq.png
}

hoursplot() {
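    # HURDAT2 best-track entries are 6-hourly snapshots, so (HU records per year) * 6
    # approximates the hours per year spent at hurricane strength.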
    awk -F, '$3=="  " && $4==" HU" { 
        print substr($1,1,4) }' atl.csv | uniq -c | awk '{ 
            print $2" "$1*6 }' | tee hours.csv | cma 10 > hours.cma

    paste -d ' ' hours.csv hours.cma > hours.plt
    
    echo "set term png size 740,480; set mytics 5
    set key outside top left horizontal
    set grid; set xrange [1850:2020]
    plot 'hours.plt' u 1:2 t 'Hurricane Hours' w l lt 3 lw 2,'' u 1:3 t '10yr CMA' w l lt 6 lw 4
    " | gnuplot > hours.png
}

presplot() {
    cat atl.csv | awk -F, 'NF==4 { print $1 } NF>4 && $3=="  " && $4==" HU" && $8!=" -999" { print $8
        }' | tr -d '\n' | sed 's/AL/\nAL/g' | awk 'NF>1{print}' | awk '{
            min=9999; for (i=2;i<=NF;i++) if ($i < min) min=$i; printf "%04d %d\n", NR, min
        }' | tee pres.csv | cma 50 > pres.cma

    paste -d ' ' pres.csv pres.cma > pres.plt

    echo "set term png size 740,480; set mytics 2
    set key outside top left horizontal
    set grid; set yrange [1005:880]
    set xrange [0:611]; set xlabel 'Hurricane #'
    set xtics nomirror; set x2tics ('1853' 1, '1900' 73, '1950' 182, '(Year)' 305, '2000' 466,'2019' 606)
    plot 'pres.plt' u 1:2 t 'Lowest Pressure' w l lt 3 lw 2,'' u 1:3 t '50 Hurricanes CMA' w l lt 6 lw 4
    " | gnuplot > pres.png
}

Land Drought Index Trend

The Standardised Precipitation-Evapotranspiration Index (SPEI) data gives us anomaly drought conditions over land spanning from 1901 to 2018 (inclusive) in monthly 0.5 degree latitude/longitude format. Today I combined all this grid data into a global land-only drought anomaly index and show its trend over time. Result:

SPE Index

Looks like it’s getting drier over land, but it also looks cyclical. Time will tell.

Enjoy 🙂 -Zoe

Chart data archived here.

Update

Northern Hemisphere
Southern Hemisphere
Tropics
Poles

Drying in NH. Tiny drying in SH. Drying in the tropics. Wetter at the poles.

Note: Tropics = abs(latitude) < 23.5, Poles = abs(latitude) > 66.5
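
The hemispheric, tropics and poles breakdowns only need a latitude filter added to the cell loop of onetime() in the code below. A minimal sketch for the tropics, assuming the same ncks field layout ($4 = latitude, $8 = SPEI value):

onetime_tropics() {
    ncks -HC --trd -v spei spei.nc -d time,$1,$1 | sed \$d | awk -F '[= ]' '
    $8 != "_" && $4 > -23.5 && $4 < 23.5 {
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=(a/2*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=$8*A
    } END { print S/SA }'
}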

Code

# Zoe Phin, 2021/05/19
# File: spei.sh
# Run: source spei.sh; require; download; alltime; plot
# Output: spei.csv, spei.yoy, spei.png

require() { sudo apt-get install -y nco gmt; }

download() {
    wget -O spei.nc --no-check-certificate https://digital.csic.es/bitstream/10261/202305/2/spei01.nc
}

onetime() {
    ncks -HC --trd -v spei spei.nc -d time,$1,$1 | sed \$d | awk -F '[= ]' '
    $8 != "_" { 
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
        A=(a/2*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2
        SA+=A; S+=$8*A
    } END { print S/SA }'
}

alltime() {
    for t in {0..1415}; do
        awk -vt=$t 'BEGIN{printf "%6.2f ", 1901+t/12+1/24}'
        onetime $t
    done | tee spei.csv
}

annual() {
    cat spei.csv | sed \$d | awk '{
        Y[substr($1,1,4)] += $2/12
    } END {
        for (y in Y) printf "%4d %.4f\n", y, Y[y]
    }'
}

yoy() {
    cat spei.csv | cut -c9- | tr '\n' ' ' | awk -vp=$1 '{
    for (i=0;i<p/2;i++) print ""
    for (i=p/2;i<=NF-p/2;i++) { s=0
        for (j=i-p/2; j<=i+p/2; j++)
            s+=$j/(p+1)
        printf "%.4f\n", s
    }}' > spei.yoy
}

plot() { 
    yoy 120; paste -d ' ' spei.csv spei.yoy > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal
        set ytics format '%4.2f'
        set mxtics 2; set mytics 5
        set xrange [1900:2020]
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Wetter' w filledcurve above y1=0 lw 2 lc rgb '#0000DD',\
                     '' u 1:2 t 'Dryer'  w filledcurve below y1=0 lw 2 lc rgb '#DD0000',\
                     '' u 1:3 t '10yr CMA' w lines lw 3 lc rgb 'black'		
    " | gnuplot > spei.png 
}

Warming near Coal Plants

I was learning some Python over the weekend, and then afterwards decided to go for a jog at my favorite spot. As I was jogging, I thought about finding out what the temperature trend at this location was by extracting and plotting it with Python. This site seems to offer the best local 4km data for the continental US.

My first Python program:

import requests as req
import numpy as np
import matplotlib.pyplot as mp

loc = {
	'spares': '4km', 'call': 'pp/yearly_timeseries', 'proc': 'gridserv', 'units': 'si',
	'stats': 'tmean', 'start': '1895', 'end': '2019', 'lon': '-84.3922', 'lat': '34.0000'
}

res = req.post('https://prism.oregonstate.edu/explorer/dataexplorer/rpc.php', data=loc)

tmean = res.json()['result']['data']['tmean']

y = np.array(tmean)
x = np.arange(1895,2020)
mp.plot(x,y)
m, b = np.polyfit(x, y, 1)
mp.plot(x, m*x+b)

mp.tight_layout()
mp.savefig('loc.png')

Result:

Azalea Park, Roswell, GA

Well look at that: no warming where I like to jog. How about downtowns of some big cities?

OK. Definitely warming here.

Then I had a great idea. What if I chose spots with coal plants?

I got a list of the top 20 coal plants in the USA from here.

Results:

Bowden               -0.0076
Scherer              -0.0084
Gibson               -0.0098
Monroe               -0.0005
Amos                 -0.0020
Miller               -0.0071
Parish               +0.0044
Cumberland           -0.0059
Gavin                -0.0071
Rockport             -0.0030
Paradise             -0.0048
Roxboro              -0.0090
Sammis               -0.0005
Stuart               -0.0070
Navajo               +0.0116
Sherburne            +0.0032
Martin               -0.0048
Belews               -0.0045
Jeffrey              -0.0062
Gaston               -0.0057

Units are degrees Celsius per year, based on the slope of the linear regression.

Only 3 out of 20 coal plant locations experienced warming in the last century. 85% had cooling!
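
For perspective, the per-year slopes can be restated per century and tallied. A minimal sketch, assuming the table above is saved to a file (slopes.txt is a hypothetical name):

$ awk '{ printf "%-12s %+6.2f C per century\n", $1, 100*$2
         if ($2 > 0) w++; else c++ }
       END { print w " warming, " c " cooling" }' slopes.txt

The tally comes out to 3 warming and 17 cooling, matching the count above.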

I suspect this fact will not cause climate alarmists to pause and think for one second.

What do skeptics think?

Enjoy 🙂 -Zoe

Corrections:

Above, Sammis image turned out to be a duplicate of Roxboro. Corrected.

Also, a typo: LogAngeles = Los Angeles

Code:

# Zoe Phin, 2021/05/09
# File: prism.py
# Setup: sudo apt-get install python3-numpy python3-matplotlib
# Run: python3 prism.py

import requests as req
import numpy as np
import matplotlib.pyplot as mp

locations = {
'Roswell':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34','lon':'-84.385'},
'NYC':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'40.7', 'lon':'-74'},
'LosAngeles':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34.0543', 'lon':'-118.2438'},
'WashingtonDC':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.8968','lon':'-77.0366'},
'Chicago':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'41.8758','lon':'-87.6191'},

'Bowden':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'34.1271','lon':'-84.9155'},
'Scherer':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.0667','lon':'-84.8'},
'Gibson':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.3697','lon':'-87.7702'},
'Monroe':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'41.8881','lon':'-83.3453'},
'Amos':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.4746','lon':'-81.8233'},

'Miller':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.6327','lon':'-87.0595'},
'Parish':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'29.4797','lon':'-95.6320'},
'Cumberland':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.3906','lon':'-87.6537'},
'Gavin':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.9366','lon':'-82.1162'},
'Rockport':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'37.9268','lon':'-87.0354'},

'Paradise':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'37.2585','lon':'-86.9799'},
'Roxboro':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.4840','lon':'-79.0725'},
'Sammis':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'40.5686','lon':'-80.4311'},
'Stuart':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'38.6375','lon':'-83.6921'},
'Navajo':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.9054','lon':'-111.3877'},

'Sherburne':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'45.3795','lon':'-93.8960'},
'Martin':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'32.2597','lon':'-94.5692'},
'Belews':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'36.2824','lon':'-80.0592'},
'Jeffrey':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'39.2857','lon':'-96.1166'},
'Gaston':{'spares':'4km','call':'pp/yearly_timeseries','proc':'gridserv','units':'si','stats':'tmean','start':'1920','end':'2019','lat':'33.2430','lon':'-86.4595'}}

for name in locations:
    res = req.post('https://prism.oregonstate.edu/explorer/dataexplorer/rpc.php',data=locations[name])

    y = np.array(res.json()['result']['data']['tmean'])
    x = np.arange(1920,2020)

    mp.title(name)
    mp.plot(x, y)
    m, b = np.polyfit(x, y, 1)
    print("%-20s %+.4f" % (name, m))
    mp.plot(x, m*x+b)

    mp.tight_layout()
    mp.savefig(name+'.png')
    mp.cla()

Snow in the Era of Global Warming

NOTE: THIS ARTICLE IS DEPRECATED. GO HERE FOR UPDATE.

Is anyone curious to know what the global snowfall trend was in this era of “extreme” global warming?

I was. Luckily NASA covertly provides us with all the necessary data to figure this out.

March 2021

I downloaded all available monthly images from 1980 to 2020 (inclusive), such as the one shown above, then I converted the pixel colors back to data using the provided scale.

The error margin is small and consistent over time, so this is a clever way to extract a rich dataset which I haven’t been able to find anywhere else.

As far as I know, you will not see this anywhere else. All other snowfall or snow-cover datasets are limited by region or date and so researchers reach the wrong conclusion!

Here is the result of my quest:

Global Snowfall
2.773 -> 2.854 is +2.90%

Snowfall has increased by nearly 3 percent over the last four decades!

Units are milligrams per square meter per second.

Let’s also see how this breaks down by North and South hemisphere:

North Hemisphere Snowfall
2.722 -> 2.468 is -9.35%
South Hemisphere Snowfall
2.824 -> 3.239 is +14.71%

The SH increase in snow more than compensates for the NH decrease. This led to an overall increase in snow during our great era of global warming!

Chart data is archived @ https://pastebin.com/raw/XpsVwdjj

That’s it. Enjoy 🙂 -Zoe

Code:

# Zoe Phin, v2 - 2021/05/07
# File: snow.sh
# Run: source snow.sh; download; index; plots
# Output: snow.png

require() { sudo apt-get install -y gmt gnuplot netpbm; }

download() {
    for y in {1980..2020}; do
        for m in {01..12}; do
            d="$y-$m-01"
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MERRA2_Snowfall_Monthly&TIME=$d'"
        done
    done > sets.sh
    bash sets.sh
}

scale() {
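    # Build replace.sed: collect every distinct legend color appearing in the twelve 2020
    # images and map it to a snowfall value, assuming the sorted colors span the color bar
    # linearly from 0 to 7.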
    for m in {01..12}; do
        pngtopnm 2020-$m-01.png | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
            printf "%03d %03d %03d\n", $1, $3, $2}' 
    done | sort -r | uniq | awk '{
        printf "s/%s %s %s/%0.2f/\n", $1, $3, $2, (NR-1)/140*7 }' > replace.sed
}

all() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)-89.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

nhs() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 1,32400p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)-89.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

shs() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 32401,64800p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | awk '{
        l=sprintf("%d",(NR-1)/360)+0.5; a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2; SA+=A; S+=$1*A
    } END { printf "%.6f\n", S/SA }'
}

index() {
    scale
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; all $f; done | tee all.csv
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; nhs $f; done | tee nhs.csv
    for f in $(ls -1 [12]*.png); do echo -n "${f/.png/} "; shs $f; done | tee shs.csv
}

linear() {
    cat $1.csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f -> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    echo -n "$1: "
    linear $1; paste -d ' ' $1.csv .lin > plot.csv
    echo "set term png size 740,470
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%.1f'
        set xtics 157788864
        set ytics 0.2; set mxtics 5; set mytics 2
        set xrange ['1979-11-01':'2021-03-01']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:(10*\$2) t 'Snowfall (mg/m²/s)' w lines lw 2 lc rgb '#0000CC',\
                     '' u 1:(10*\$3) t 'Linear Regression'  w lines lw 3 lc rgb '#000055'		
    " | gnuplot > snow-$1.png 
}

plots() { plot all; plot nhs; plot shs; }

archive() {
    ( echo "Date,Global,NH,SH"; 
    paste all.csv nhs.csv shs.csv | awk '{print $1","$2","$4","$6}' ) > data.csv
}

Tiniest Useful Executable

A long long time ago, when I got my first job on Wall Street, I had the opportunity to take part in writing some of the fastest high-frequency financial execution engines possible. Although I barely remember the x86 assembly language, I still haven’t completely forgotten it either.

I wanted to use some of these forgotten skills to write some fast climate data gathering programs because there’s too much bloat in existing software, but honestly, it’s just too tough and I don’t have the time. But in trying, I wrote the tiniest useful program possible on the x86 platform.

While some girls like to solve standard puzzles, I’ve always enjoyed putting my autistic skills to finding new exotic puzzles to solve.

So what puzzle am I trying to solve?

A: Creating the smallest program on Linux that actually does something useful. This program must be its own executable and not just text interpreted by another program.

What can be more useful than being able to do anything a computer can do?

How about a compiler? …. What kind of a compiler?

How about compiling 1’s and 0’s into runnable code?

When was the last time anyone actually programmed by typing only 1’s and 0’s on the keyboard?

Well, I’m bringing it back!

First I needed to find out how to make the smallest runnable program on Linux, and its file size. Luckily someone figured this out in a very clever way. Thank you, Brian Raiter.

Here, at last, we have honestly gone as far as we can go. There is no getting around the fact that the 45th byte in the file, which specifies the number of entries in the program header table, needs to be non-zero, needs to be present, and needs to be in the 45th position from the start of the ELF header. We are forced to conclude that there is nothing more that can be done.

Brian Raiter

Here is Brian’s solution:

; tiny.asm - Brian Raiter
BITS 32
        org     0x00010000

        db      0x7F, "ELF"  
        dd      1           
        dd      0           
        dd      $$          
        dw      2           
        dw      3           
        dd      _start      
        dd      _start      
        dd      4          
_start:
        mov     bl, 42    ; M
        xor     eax, eax  ; E
        inc     eax       ; A   
        int     0x80      ; T
        db      0
        dw      0x34          
        dw      0x20  
        db      1        

; -----------------------     
; $ nasm -f bin -o a.out tiny.asm
; $ chmod +x a.out
; $ ./a.out ; echo $?
; 42
; $ wc -c a.out
;      45 a.out

This 45-byte program (the tiniest possible) simply sets the exit code to 42 and exits. The real meat of his program is just 7 bytes, and can actually be 5 (no xor eax,eax) with the same result.

While Brian used NASM, I preferred to use FASM, since I’m more familiar with it.

Now here’s my tiny masterpiece:

;   Zoe Phin, 2021/04/24
;   bex.asm
    use32
    org         0x505A4000
      
    db          0x7F, "ELF"     
    dd          1                               
    dd          0                               
    dd          $$                              
    dw          2               
    dw          3               
    dd          start           
    dd          start           
    dd          4               

    start:      pop     ecx     
                pop     edi
                jmp     .end    

    .arg:       pop     esi     

        .char:  lodsb
                cmp     al, 0
                jmp     .cont   

    dw          32              
    dd          1               

        .cont:  je      .done
                rcr     al, 1
                adc     ah, ah
                jmp     .char

        .done:  mov     [edi+edx], ah
                inc     edx
                xor     eax,eax

    .end:       loop    .arg
                jmp     edi

I call it bex, short for Binary EXecution. What does it do?

It converts every argument consisting of ASCII 1’s and 0’s on the command line into its binary representation, and then jumps execution to the first argument.

So, for example, the shortest program would be just to exit cleanly without a segfault:

$ ./bex 01000000 11001101 10000000

The ASCII binary represents opcodes for

inc eax    ; 01000000
int 80h    ; 11001101 10000000

Brian’s program would be:

mov bl, 42 ; 10110011 00101010
inc eax    ; 01000000
int 80h    ; 11001101 10000000

Run it:

$ ./bex 10110011 00101010 01000000 11001101 10000000
$ echo $?
42
$

And now I test a semi-Quine:

$ ./bex 10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000 | xxd -b -c10 | cut -c11-99

10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000

It’s a semi-Quine because the output is a binary representation of the ASCII input, but not the ASCII input itself.

The assembly code equivalent is:

mov     al, 4    ; 10110000 00000100
inc     ebx      ; 01000011
mov     ecx,edi  ; 10001001 11111001
int     80h      ; 11001101 10000000
xchg    eax,ebx  ; 10010011
int     80h      ; 11001101 10000000

A few notes:

  • Leading 0’s are not needed in an argument ( ex: 101 = 00000101 )
  • 1’s and 0’s are not specifically needed: any character with an odd ASCII value counts as a 1, and any even character counts as a 0 (only the lowest bit of each character matters)
  • Any non-binary data can also be placed directly inline

Example of notes:

$ ./bex 7.77.... 188 1....11 10110010 ++.+ 30003003 1111---1 121221 11414441 11661121 9TTTTTTT 1RR1PP11 11351171 9:::.::: "Hello World!"

Hello World!

Assembly equivalent:

mov     al, 4    ; 10110000 00000100 ; 7.77.... 188
inc     ebx      ; 01000011          ; 1....11
mov     dl, 13   ; 10110010 00001101 ; 10110010 ++.+
mov     ecx,esi  ; 10001001 11110001 ; 30003003 1111---1
sub     ecx,edx  ; 00101001 11010001 ; 121221 11414441 
int     80h      ; 11001101 10000000 ; 11661121 9TTTTTTT
xchg    eax,ebx  ; 10010011          ; 1RR1PP11
int     80h      ; 11001101 10000000 ; 11351171 9:::.:::

The code basically tells Linux to write the 13 characters at the end of the argument area (the final “Hello World!” argument) and then exit.

You can obtain bex by writing the binary directly and making it executable, like so:

$ echo -en $(printf "\\\x%s" 7F 45 4C 46 01 00 00 00 00 00 00 00 00 40 5A 50 02 00 03 00 20 40 5A 50 20 40 5A 50 04 00 00 00 59 5F EB 1A 5E AC 3C 00 EB 06 20 00 01 00 00 00 74 06 D0 D8 10 E4 EB ED 88 24 17 42 31 C0 E2 E4 FF E7) > bex; chmod +x bex

Or you can execute a script to download FASM and compile and test from source code:

$ bash <<< $(wget -qO- https://pastebin.com/raw/Tvmg0nr6 | tr -d '\r')

Successful result:

flat assembler  version 1.73.27  (16384 kilobytes memory)
2 passes, 66 bytes.
Input:  10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000
Output: 10110000 00000100 01000011 10001001 11111001 11001101 10000000 10010011 11001101 10000000
Match!

The tiniest useful executable is only 66 bytes!

Mission accomplished. Puzzle solved.

You can do pretty much anything in it! Its inconvenience is a feature 😉

A few notes on the run environment:

  • edx is set to number of arguments
  • edi points to first argument
  • esi points to NULL located after arguments, just before ENVironment variables.

Yes, this is obviously a 32-bit program, but it runs perfectly fine on x86-64 based Linux.

Enjoy 🙂 -Zoe

P.S.: As a bonus, bex includes my initials directly in the binary:

00000000: 7F 45 4C 46 01 00  .ELF..
00000006: 00 00 00 00 00 00  ......
0000000c: 00 40 5A 50 02 00  .@ZP..
00000012: 03 00 20 40 5A 50  .. @ZP
00000018: 20 40 5A 50 04 00   @ZP..
0000001e: 00 00 59 5F EB 1A  ..Y_..
00000024: 5E AC 3C 00 EB 06  ^.<...
0000002a: 20 00 01 00 00 00   .....
00000030: 74 06 D0 D8 10 E4  t.....
00000036: EB ED 88 24 17 42  ...$.B
0000003c: 31 C0 E2 E4 FF E7  1.....

4 Decade Global Snowfall Trend

NOTE: THIS POST AND CODE IS DEPRECATED. PLEASE GO HERE FOR UPDATED VERSION.

Is anyone curious to know what the global snowfall trend was in this era of “extreme global warming”?

I was. Luckily NASA covertly provides us with all the necessary data to figure this out.

January 2021

I downloaded all available monthly images from 1980 to 2020 (inclusive), such as the one shown above, then I converted the pixel colors back to data using the provided scale.

The error margin is small and consistent over time, so this is a clever way to extract a rich dataset which I haven’t been able to find anywhere else.

As far as I know, you will not see this anywhere else. All other snowfall or snow-cover datasets are limited by region or date and so researchers reach the wrong conclusion!

Here is the result of my quest:

Global Snowfall
0.2773 -> 0.2854 is +2.90%

Snowfall has increased by nearly 3 percent over the last four decades.

Let’s also see how this breaks down by North and South hemisphere:

North Hemisphere Snowfall
0.2722 -> 0.2468 is -9.35%
South Hemisphere Snowfall
0.6257 -> 0.7057 is +12.77%

That’s it. Enjoy 🙂 -Zoe

NOTE: THIS POST AND CODE IS DEPRECATED. PLEASE GO HERE FOR UPDATED VERSION.

Correction:

The units are in kilograms divided by 100000 (better known as centigrams), not kilograms. I forgot about the original scale and mislabeled the charts. I won’t fix the chart. The purpose was to find the trend.

Update:

You can generate your own charts using data archived here.

Code:

# Zoe Phin, 2021/05/04
# File: snow.sh
# Run: source snow.sh; download; index; plot
# Output: snow.png

require() { sudo apt-get install -y netpbm gmt; }

sets() {
    for y in {1980..2020}; do
        for m in {01..12}; do
            d="$y-$m-01"
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MERRA2_Snowfall_Monthly&TIME=$d'"
        done
    done > sets.sh
}

download() { 
    sets; bash sets.sh; 
    find . -name \*png -type f -size -10k -exec rm {} \;
}

scale() {
    for m in {01..12}; do
        pngtopnm 2020-$m-01.png | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
            printf "%03d %03d %03d\n", $1, $3, $2}' 
    done | sort -r | uniq | awk '{
        printf "s/%s %s %s/%0.2f/\n", $1, $3, $2, (NR-1)/140*7 }' > replace.sed
    echo "s/000 000 000/999/" >> replace.sed
}

onefile() {
    pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | sed -n 1,64800p | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | sed 's/... ... .../999/' | awk '$1!=999{
        l=sprintf("%d",(NR-1)/360)-89.5
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA+=A; S+=$1*A
    } END {
        printf "%.6f\n", S/SA
    }'
}

index() {
    scale
    for f in $(ls -1 [12]*.png); do
        echo -n "${f/.png/} "
        onefile $f
    done | tee .csv
}

linear() {
    cat .csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f -> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "set term png size 740,560
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%.2f'
        set xtics 157788864
        set ytics 0.02; set mxtics 5; set mytics 2
        set xrange ['1979-11-01':'2021-03-01']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Snowfall (kg/m²/s)' w lines lw 2 lc rgb '#0000CC',\
                     '' u 1:3 t 'Linear Regression'  w lines lw 3 lc rgb '#000055'		
    " | gnuplot > snow.png 
}

Trend in Gross Primary Production

I downloaded all 8-day increment Gross Primary Production data from NASA from 2001 to 2020 (inclusive) to find the 20-year trend.

April 23, 2021

The Result:

+9.17% Rise in Gross Primary Production
0.0225 --> 0.0246, is +9.17%

This is all good news. Enjoy 🙂 -Zoe

The Code:

# Zoe Phin, 2021/05/03
# File: prod.sh
# Run: source prod.sh; download; index; plot
# Output: trend info & prod.png

require() { sudo apt-get install -y netpbm gmt; }

sets() {
    for y in {2001..2020}; do
        let n=0
        for w in {0..45}; do
            d=$(date -d "$y-01-01 +$n days" +%Y-%m-%d)
            let n+=8
            echo "wget -O $d.png 'https://gibs.earthdata.nasa.gov/wms/epsg4326/all/wms.cgi?REQUEST=GetMap&SERVICE=WMS&FORMAT=image/png&VERSION=1.1.1&SRS=epsg:4326&BBOX=-180,-90,180,90&TRANSPARENT=TRUE&WIDTH=360&HEIGHT=180&LAYERS=MODIS_Terra_L4_Gross_Primary_Productivity_8Day&TIME=$d'"
        done
    done > sets.sh
}

download() { 
    sets; bash sets.sh; 
    find . -name \*png -type f -size -10k -exec rm {} \;
}

scale() {
    awk 'BEGIN{
        for (i=100;i<=255;i++) printf "%03d 000 000\n", i
        for (i=000;i<=255;i++) printf "255 %03d 000\n", i
        for (i=255;i>=000;i--) printf "%03d 255 000\n", i
        for (i=000;i<=255;i++) printf "000 255 %03d\n", i
        for (i=255;i>=000;i--) printf "000 %03d 255\n", i
    }' | awk '{printf "s/%s/%0.6f/\n", $0, (NR-1)/1180*0.095 }' > replace.sed
    echo "s/000 000 000/999/" >> replace.sed
}

onefile() {
    scale; pngtopnm $1 | pnmtoplainpnm | sed '1,3d;s/  /\n/g' | awk '{
    printf "%03d %03d %03d\n", $1, $2, $3}' | sed -f replace.sed | sed 's/... ... .../999/' | awk '$1!=999{
        l=sprintf("%d",(NR-1)/360)-89.5
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180; 
        A=(a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA+=A; S+=$1*A
    } END {
        printf "%.6f\n", S/SA
    }'
}

index() {
    for f in $(ls -1 2*.png); do
        echo -n "${f/.png/} "
        onefile $f
    done | tee .csv
    cp .csv bak.csv
}

linear() {
    cat .csv | sed \$d | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 }' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f --> %.4f, is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal reverse
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01; set mxtics 2; set mytics 5
        set xrange ['2000-12-01':'2020-02-01']
        set grid xtics mxtics ytics mytics
        plot 'plot.csv' u 1:2 t 'Gross Primary Production (kg C/m²)' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#005500'		
    " | gnuplot > prod.png 
}

Global Water Dynamics

I think this dataset is very interesting. We can monitor global water level changes from 1999 to 2020 layered on top of Google Maps here. I naturally chose the Maldives as the starting point to see if the tiny island nation is “doomed” as climate alarmists like to claim. Of course it isn’t.

Male, Maldives

Hardly any water level gains and plenty of reclamation. You are free to browse to other nearby islands and see that indeed there is very little encroachment of water.

The story is a little different for Bangladesh:

South Bangladesh

There is some water gain in certain areas, but there is also loss in other areas. There is little evidence of encroachment from the ocean. Just river dynamics.

Florida:

South Florida

I don’t see much coastal water encroachment.

I’d love to run some analysis on this data, but the amount of it is simply overwhelming. It’s safe to say that since the researchers behind this data didn’t quantify coastal water gain/loss for us, it most likely doesn’t favor the alarmist position.

Anyways, Enjoy 🙂 -Zoe

Sorry I’ve been very busy lately.

Trend of Chlorophyll in Water

NASA has a data product that tracks the amount of chlorophyll in water across the globe.

I downloaded all available 2003-2020 (inclusive) monthly data in 1440 by 720 pixel format to see how chlorophyll in water changes over time.

This task is actually not easy because there’s a lot of missing data (black pixels). I decided to use only non-missing pixels that are persistent across all 216 months. There are 27998 of them. That’s 2.7% of the globe.
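
That fraction checks out against the grid size (each monthly image is 1440 x 720 pixels):

$ awk 'BEGIN { printf "%.1f%%\n", 100*27998/(1440*720) }'

2.7%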

This is a map of these persistent locations:

Omnipresent Chlorophyll Measures

Here’s the trend for these locations:

Change (2003,2020): 2.942 %
Linear Regression Change (2003-2020): 2.736 %

That’s a nice positive change.

I then thought about it some more, and decided to do it on a monthly basis. Here’s a map of all persistent January pixels for all 18 years:

And here is all of July:

Here is the linear regression trend for every month:

01 -0.046 %
02 +4.834 %
03 +8.112 %
04 +1.008 %
05 -7.842 %
06 -1.898 %
07 +1.130 %
08 +5.267 %
09 +6.818 %
10 +1.999 %
11 +5.372 %
12 +3.786 %

Avg: +2.378 %

Enjoy 🙂 -Zoe

Update

Some more data:

Note the rise from 1997 to 1998. That was a very warm season. Look at the rise! How can anyone be against global warming, except the cult of death?

Code

# Zoe Phin, 2021/03/24
# File: chloro.sh
# Run: source chloro.sh; require; download; convert; chloro; cmap; plot; change; annual; months

require() { sudo apt-get install -y gmt gnuplot; }

download() {
    rm -f *.csv
    for y in {2003..2020}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MY1DMM_CHLORA&year=$y"
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
    awk '{print "wget -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=1440&height=720\""}' sets.csv > sets.sh
    bash sets.sh
}

convert() {
    for f in $(ls -1 2*.csv); do
        cat $f | tr ',' '\n' > ${f/.csv/.dat}
    done
    ls -1 *.dat | xargs paste -d ' ' | nl > .db
    grep -v '99999.0' .db > chloro.db
    rm -f *.dat .db
}

chloro() {
    awk 'BEGIN {
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
    } { 
        l = sprintf("%d", ($1-1)/1440)/4-89.875
        A = (a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        SA += A
        for (i=2; i<=NF; i++) 
            S[i] += $i*A
    } END {
        for (m in S)
            printf("%.5f\n", S[m]/SA)
    }' chloro.db | awk '{
        printf "%.2f %.5f\n", 2003+(NR-1)/12+1/24, $1
    }' > chloro.csv
}

cmap() {
    awk '{
        lat = sprintf("%d", ($1-1)/1440)/4-89.875
        lon = ($1-1)%1440*360/1440-180
        printf "%.3f %.2f\n", lat, lon
    }' chloro.db > cmap.csv
    echo "set term png size 560,360
        set margin 4,2,2,1
        set nokey 
        set yrange [-90:90]
        set xrange [-180:180]
        set ytics 30; set xtics 30
        set grid ytics xtics
        plot 'cmap.csv' u 2:(-\$1) t 'Chlorophyll' pt 5 ps 0.1 lc rgb '#00CC00'
    "| gnuplot > cmap.png 
}

plot() { 
    echo -n "Linear Regression Slope: "
    cat chloro.csv | gmt gmtregress -Fp -o5 | awk '{printf "%.5f /year\n", $1}'
    cat chloro.csv | gmt gmtregress | sed 1d | awk '{printf "%.2f %.5f %.5f\n", $1, $2, $3}' > plot.csv
    echo "set term png size 740,620
        set key outside top center horizontal
        set mxtics 2; set mytics 2
        set format y '%.2f'
        set xrange [2003:2021]
        set grid xtics mxtics ytics mytics
        plot 'plot.csv' u 1:2 t 'Chlorophyll (mg/m³)' w lines lw 2 lc rgb '#00CC66',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#006666'		
    "| gnuplot > chloro.png 
}

change() {
    echo -n "Change (2003,2020): "
    awk '{Y[substr($1,1,4)]+=$2/12} END { printf "%.3f %%\n", (Y[2020]/Y[2003]-1)*100 }' plot.csv
    echo -n "Linear Regression Change (2003-2020): "
    awk '{Y[substr($1,1,4)]+=$3/12} END { printf "%.3f %%\n", (Y[2020]/Y[2003]-1)*100 }' plot.csv
}

annual() {
    awk '{Y[substr($1,1,4)]+=$2/12} END { for (y in Y) printf "%s %.5f\n", y, Y[y]}' chloro.csv
}

months() {
    for m in {01..12}; do
        echo -n "$m "
        for f in $(ls -1 2*-$m-*.csv); do
            cat $f | tr ',' '\n' > ${f/.csv/.dat}
        done
        ls -1 2*-$m-*.dat | xargs paste -d ' ' | nl > .db
        grep -v '99999.0' .db | tee chloro.db | awk 'BEGIN {
            a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        } { 
            l = sprintf("%d", ($1-1)/1440)/4-89.875
            A = (a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
            SA += A
            for (i=2; i<=NF; i++) 
                S[i] += $i*A
        } END {
            for (m in S)
                printf("%.7f\n", S[m]/SA)
        }' | awk '{
        printf "%.2f %.5f\n", 2003+(NR-1), $1
        }' | gmt gmtregress | sed 1d | awk '{Y[substr($1,1,4)]+=$3/12} END { printf "%+.3f %%\n", (Y[2020]/Y[2003]-1)*100 }'
    done | tee .seas
    awk '{S+=$2} END { printf "\nAvg: %+.3f %%\n", S/NR }' .seas
}

Coastal Sealevel Rise

Climate alarmists are worried that the sea level is rising too fast and flooding is coming soon. You can find many data images like this on the net:

3.2 mm/year. The problem is that this is for all ocean water. If flooding is a concern, then shouldn’t we ask what is happening at the coasts? Is it different from the ocean as a whole?

I decided to find out.

I downloaded over a gigabyte of 720×361 gridded data covering 1950 to 2009. I only examine those grid cells that are adjacent to land (2,808 of the 720 × 361 = 259,920 cells).

Here is my result:

1950  -4.387969	1970  -1.113571	1990  +0.478104
1951  -3.858797	1971  -0.531201	1991  +0.651717
1952  -4.040961	1972  -0.770646	1992  +0.665983
1953  -3.568315	1973  -1.020810	1993  +0.086299
1954  -3.699824	1974  -0.375272	1994  +1.093564
1955  -2.807692	1975  -0.504674	1995  +1.871986
1956  -3.675497	1976  -1.817893	1996  +3.126923
1957  -3.445263	1977  -1.405565	1997  +2.290404
1958  -3.788105	1978  -0.750346	1998  +4.180212
1959  -3.993297	1979  -1.387182	1999  +5.105531
1960  -2.135054	1980  -1.673765	2000  +4.515499
1961  -2.499847	1981  -0.377484	2001  +4.702255
1962  -2.632606	1982  -2.025174	2002  +3.391415
1963  -2.978503	1983  +0.697424	2003  +4.399230
1964  -3.627167	1984  +0.632456	2004  +4.762698
1965  -2.821440	1985  -0.085001	2005  +4.951383
1966  -2.954607	1986  -0.053231	2006  +5.608991
1967  -2.215466	1987  -0.204903	2007  +5.249474
1968  -2.939226	1988  +0.930574	2008  +7.056215
1969  -1.268895	1989  +0.929616	2009  +6.901877

Trend (mm/year): 1.68746

1.69 mm/year. As you can see, the coastal trend is about half the total ocean trend. Funny how greenhouse gases do that 😉

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/03/17
# File: coastal.sh
# Run: . coastal.sh ; require; download; alltime; analysis

require() { sudo apt-get install -y nco gmt; }

download() {
    wget -cO 1950s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19500103_19591227_v1.nc.gz
    wget -cO 1960s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19600103_19691227_v1.nc.gz
    wget -cO 1970s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19700103_19791227_v1.nc.gz
    wget -cO 1980s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19800103_19891227_v1.nc.gz
    wget -cO 1990s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_19900103_19991227_v1.nc.gz
    wget -cO 2000s.nc https://podaac-opendap.jpl.nasa.gov/opendap/allData/recon_sea_level/preview/L4/tg_recon_sea_level/CCAR_recon_sea_level_20000103_20090627_v1.nc.gz
}

onetime() {
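    # Scan the grid in order and keep only cells with data that border a missing ("_") cell
    # along the scan direction -- a proxy for "adjacent to land"; those coastal values are
    # area-weighted and averaged, with latitudes limited to +/-57.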
    ncks --trd -HC -d time,$2,$2 -v ssha ${1}s.nc | sed \$d | awk -F '[= ]' '
    function s(a) { return sin(atan2(0,-1)*a/180) }
    function area(lat) { if (lat<0) lat*=-1; return (s(lat+0.25)-s(lat-0.25))/720 }
    NR==1 { T=$2 }
    $4>=-57 && $4<=57 {	
        if (NR != 1) {
            if ($8 == "_" && D != "_") { A=area(L);  S+=D*A;  N+=A }
            if ($8 != "_" && D == "_") { A=area($4); S+=$8*A; N+=A }
        }
        L=$4; D=$8
    } END {
        T=sprintf("%d",T)
        "date +%Y-%m-%d -d \"1900-01-01 +"T" days\"" | getline t
        printf "%s %+010.6f\n", t, S/N/10
    }'
}

alltime() { (
    for d in {1950..1990..10}; do
        for t in {000..519}; do
            onetime $d $t
        done
    done
    for t in {000..493}; do
        onetime 2000 $t
    done ) | tee sealevel.csv
}

analysis() {
    awk '{ Y[substr($1,1,4)]+=$2; N[substr($1,1,4)]+=1 
    } END { for (i in Y) printf "%.0f %+10.6f\n", i, Y[i]/N[i]
    }' sealevel.csv | tee .result | gmt gmtregress -Fp -o5 > .trend

    cat .result | column -c 60
    echo -n "Trend (mm/year): "
    cat .trend | awk '{print $1*10}'
}

81 Million Ballots

Supposedly 81 million actual people, not just ballots, voted for Joe Biden. There are ample reasons to suspect that foul play bumped this number by several million. You can find the evidence if you’re looking for it with an open mind. Even if that’s not the case (doubtful), actual enthusiasm for this highly popular president is, shall we say … laughable. One way to track this is the like-to-dislike ratio on White House Youtube videos. The actual like-to-dislike ratio, not the manipulated one.

As you know, I analyzed this a short while ago in my article White House Youtube Dislike Manipulation. I am happy it inspired someone to keep a persistent watch on the most popular administration ever:

81m.org

Here we can see that Biden’s recent SOTU address was disliked by 93.31% of online voters.

The media might have you believe these dislikes are bots and thus Google is justified in removing them. Odd. So only dislikers would bother to use “bots”? That’s how engaged they are? Likers are not engaged? Current White House fans have no money to buy bots, or ask Google for a like boost?

No, I don’t think so. That excuse won’t fly. I think these are engaged people voting legitimately, something the establishment couldn’t care less about.

Biden’s address was stupid and creepy, and he deserved all legitimate dislikes for it, including mine.

But that’s just my opinion. This post is not about my political opinions. I just wanted to thank 81m.org for keeping track of this situation and acknowledging their inspiration.

Give them a visit.

Best regards,

-Zoe

Accurate Global Greening

In a previous post, Fortunate Global Greening, I used low-resolution NASA data to determine changes to the Vegetation Index. I did this because I didn’t want to spend 5 hours downloading 23 gigabytes of data at the highest resolution. I didn’t think it would matter much, but unfortunately for me, it does. Here’s the new analysis:

0.3746 --> 0.3937 is +5.08%

I made two other changes. I now average over land only, rather than the whole globe; that’s why these numbers are much larger than before. I also use a linear regression, so I don’t lose a whole year to a moving-average window.

The actual global greening this century, using highest resolution data, is a little over 5%, not nearly 10% as I previously found.

Semper veritas. -Zoe

Code

# Zoe Phin, 2021/03/07
# File: veg.sh  (Version 2.0)
# Run: source veg.sh; sets; download; index; plot

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_NDVI_16&year=$y" 
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv -c \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=3600&height=1800\""}' sets.csv > sets.sh
    bash sets.sh
    rm -f 201[789]-12-31.csv
}

index() {
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        awk -F, '{
            a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180;
            l=NR*180/1800-90.05
            A=(0.1*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
            for (i=1;i<=NF;i++) { 
                if ($i==99999) continue   # 99999 = no data (ocean and unretrieved cells), so this is a land-only mean
                SA+=A; S+=$i*A
            }
        } END {
            printf "%.6f\n", S/SA
        }' $f
    done > .csv
}

linear() {
    cat .csv | awk '{ "date +%Y\\ %j -d "$1 | getline t; print t" "$2 } ' | awk '
        {printf "%.4f %s\n", $1+$2/365, $3}' | gmt gmtregress | awk '
        NR>1 { printf "%.6f\n", $3 }' | tee .lin | sed -n '1p;$p' | tr '\n' ' ' | awk '{
        printf "%.4f --> %.4f is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
}

plot() { 
    linear; paste -d ' ' .csv .lin > plot.csv
    echo "
        set term png size 740,620
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01
        set mxtics 2
        set mytics 5
        set xrange ['2000-01-01':'2021-02-28']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Vegetation Index ' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t 'Linear Regression' w lines lw 3 lc rgb '#005500'		
    " | gnuplot > veg.png 
}

Land Change in Australia

This is for my Aussie fans. I show how Australia’s landscape changed from 2001 to 2019 using the best available satellite data.

Parts of Indonesia and Papua New Guinea that would otherwise appear on the map are changed to water.

Changes:

Type                                |   2001  |   2019  |  Change |  % Chg
------------------------------------+---------+---------+---------+---------
Evergreen Needleleaf Forest         |  0.4615 |  0.4885 | +0.0270 |   +5.85%
Evergreen Broadleaf Forest          |  2.4580 |  2.5659 | +0.1079 |   +4.39%
Deciduous Broadleaf Forest          |  0.0094 |  0.0167 | +0.0073 |  +77.66%
Mixed Forests                       |  0.0570 |  0.0737 | +0.0167 |  +29.30%
Closed Shrubland                    |  3.8315 |  4.0744 | +0.2429 |   +6.34%
Open Shrublands                     | 54.0555 | 54.7268 | +0.6713 |   +1.24%
Woody Savannas                      |  1.6631 |  1.9115 | +0.2484 |  +14.94%
Savannas                            |  6.4679 |  6.6289 | +0.1610 |   +2.49%
Grasslands                          | 24.9655 | 23.3947 | -1.5708 |   -6.29%
Permanent Wetlands                  |  0.2007 |  0.2198 | +0.0191 |   +9.52%
Croplands                           |  3.4671 |  3.5146 | +0.0475 |   +1.37%
Urban and Built-up                  |  0.1321 |  0.1388 | +0.0067 |   +5.07%
Cropland/Natural Vegetation Mosaic  |  0.0065 |  0.0110 | +0.0045 |  +69.23%
Snow and Ice                        |  0.0001 |  0.0005 | +0.0004 | +400.00%
Barren or Sparsely Vegetated        |  2.2241 |  2.2339 | +0.0098 |   +0.44%

Columns are in overall percent, except the last, which shows percent change from 2001 to 2019.
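The last column is just the ratio of the two year columns. A quick check on the Woody Savannas row:

awk 'BEGIN { printf "%+.2f%%\n", (1.9115/1.6631 - 1)*100 }'   # prints +14.94%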

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/28
# File: australia.sh
# Run: source australia.sh; require; download <user> <pass>; prepare; maps; animate; analyze

require() { sudo apt-get install hdf4-tools imagemagick; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse_mlc() {
    ncdump-hdf -v Majority_Land_Cover_Type_1 $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%02d ", $i}' | fold -w21600 > $1.mlc
}

parse_lct() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%03d ", $i}' | fold -w489600 > $1.dat
}

prepare() { parse_mlc 2001; parse_mlc 2019; parse_lct 2001; parse_lct 2019; }

aus() { ( echo -e 'P3\n830 680\n255'; sed -n 2001,2680p $1.mlc | awk '{
    for (x=5851; x<=6680; x++) {
        if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) printf "000 000 128 "   # mask Indonesia/PNG cells as water
        else { if ($x==0) printf "000 000 128 "
            if ($x == 01) printf "024 160 064 "
            if ($x == 02) printf "041 216 082 "
            if ($x == 03) printf "156 216 083 "
            if ($x == 04) printf "158 251 183 "
            if ($x == 05) printf "151 202 178 "
            if ($x == 06) printf "193 163 181 "
            if ($x == 07) printf "244 230 206 "
            if ($x == 08) printf "219 240 188 "
            if ($x == 09) printf "249 224 000 "
            if ($x == 10) printf "239 198 160 "
            if ($x == 11) printf "087 153 208 "
            if ($x == 12) printf "246 242 153 "
            if ($x == 13) printf "251 005 000 "
            if ($x == 14) printf "156 168 128 "
            if ($x == 15) printf "250 250 250 "
            if ($x == 16) printf "195 195 195 "
        }
    } print "" }' ) > .pnm 
    convert .pnm -fill white -stroke white -pointsize 30 -gravity NorthEast -annotate 0 "$1" aus$1.png
}

maps() { aus 2001; aus 2019; }

animate() { convert -loop 0 -delay 200 aus*.png animaus.gif; }

count() {
    sed -n 2001,2680p $1.dat | awk 'BEGIN { 
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180 
    } {
        l=(NR+2001)*180/3600-90.025
        A=(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        for (t=2; t<=17; t++)
            for (x=5851; x<=6680; x++) {
                if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) continue   # skip the masked Indonesia/PNG cells
                S[t] += $(17*x+t)/100*A
            }
    } END {
        for (t=2; t<=17; t++) SA+=S[t]
        for (t=2; t<=17; t++) {
            printf "%07.4f\n", S[t]/SA*100
        }
    }' 
}

analyze() { 
    echo 'Evergreen Needleleaf Forest
        Evergreen Broadleaf Forest
        Deciduous Needleleaf Forest
        Deciduous Broadleaf Forest
        Mixed Forests
        Closed Shrubland
        Open Shrublands
        Woody Savannas
        Savannas
        Grasslands
        Permanent Wetlands
        Croplands
        Urban and Built-up
        Cropland/Natural Vegetation Mosaic
        Snow and Ice
        Barren or Sparsely Vegetated' | tr -d '\t' > .type

    count 2001 > .2001
    count 2019 > .2019

    echo
    echo 'Type                                |   2001  |   2019  |  Change |  % Chg' 
    echo '------------------------------------+---------+---------+---------+---------'
    paste -d, .type .2001 .2019 | awk -F, '$2!=0 {
        printf "%-35s | %7.4f | %7.4f | %+7.4f | %+7.2f%\n", $1, $2, $3, $3-$2, ($3/$2-1)*100
    }'
    echo
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Surface Change

NASA provides global land cover classification data:

2011, Source

Unfortunately it stops in 2011. I did a little bit more digging and found a great resource here. What I wanted to do was show surface changes over time. Here’s my result:

Each year column shows coverage in percent, and the last column shows the percent change from 2001 to 2019.

No analysis in this post. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/25
# File: landchg.sh
# Run: source landchg.sh; require; download <user> <pass>; prepare; analyze

require() { sudo apt-get install hdf4-tools; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2010.hdf --user=$1 --password=$2 $base/2010.01.01/MCD12C1.A2010001.006.2018053185051.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
    for (i=1; i<=NF; i++) 
        printf "%03d ", $i
    }' | fold -w489600 | awk '{
    for (t=1; t<=17; t++) {
        for (l=0; l<=7199; l++)
            sum += $(17*l+t)
        printf "%.4f ", sum/7200
        sum = 0
    }
    print ""
    }' > $1.lat
}

# area of a 0.05 x 0.05 degree cell on the WGS84 ellipsoid at latitude l, divided by
# ~70842.45 km^2 (the area of one 0.05-degree-wide longitude strip) so the weights sum to ~1
area() { awk 'BEGIN { a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
    for (l=-89.975; l<=89.975; l+=0.05)
        printf "%.9f\n",(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2/70842.4493856
    }' > .area
}

whole() { paste -d ' ' .area $1.lat | awk '{ 
        for (i=2; i<=NF; i++) t[i] += $1*$i; 
    } END { 
        for (i=2; i<=NF; i++) printf "%.4f ", t[i]   # iterate explicitly so the 17 type columns keep their order
        print "" 
    }'
}

prepare() { parse 2001; parse 2010; parse 2019; }

analyze() { area; echo 'Water
    Evergreen Needleleaf Forest
    Evergreen Broadleaf Forest
    Deciduous Needleleaf Forest
    Deciduous Broadleaf Forest
    Mixed Forests
    Closed Shrubland
    Open Shrublands
    Woody Savannas
    Savannas
    Grasslands
    Permanent Wetlands
    Croplands
    Urban and Built-up
    Cropland/Natural Vegetation Mosaic
    Snow and Ice
    Barren or Sparsely Vegetated' | tr -d '\t' > .type

    whole 2001 | tr ' ' '\n' > .2001
    whole 2010 | tr ' ' '\n' > .2010
    whole 2019 | tr ' ' '\n' > .2019
    echo 'Type                                |  2001  |  2010  |  2019  |   % Chg' 
    echo '------------------------------------+--------+--------+--------+----------'
    paste -d, .type .2001 .2010 .2019 | sed '$d' | awk -F, '{
        printf "%-35s | %6.3f | %6.3f | %6.3f | %+7.3f%\n", $1, $2, $3, $4, ($4/$2-1)*100
    }'
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Us and Enceladus

Enceladus is the 6th largest moon of Saturn. It has the distinction of being the most reflective object in the solar system.

Photo by NASA’s Cassini Probe

The bond albedo of Enceladus is 0.81.

Let’s figure out what the average temperature of Enceladus should be using the standard approach. This is determined by 2 things:

  1. Insolation
  2. Longwave Radiation from Saturn.

The combined formula is:

( ( TSI*(1-Ea)/4 + (TSI*(1-Sa)/(Ds/Rs)^2)/4 )/σ )^0.25

TSI = Total Solar Irradiance

Ea = Enceladus Bond Albedo, Sa = Saturn Albedo

Rs = Saturn Radius, Ds = Distance from Saturn to Enceladus

We use data from here and here, albedo from [Howett 2010], and emissivity = 1 from [Howett 2003].

Do the math:

14.82*(1-0.81)/4 + (14.82*(1-0.342)/3.9494^2)/4 =

0.704 + 0.156 = 0.86

(0.86 / 5.67e-8)^0.25 = 62.4 K
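The same arithmetic as a quick awk one-liner, in case you want to plug in other values (all numbers as above):

awk 'BEGIN {
    TSI=14.82; Ea=0.81; Sa=0.342; DR=3.9494; sb=5.670367E-8
    sun = TSI*(1-Ea)/4        # absorbed sunlight, spread over the sphere
    sat = TSI*(1-Sa)/DR^2/4   # Saturn's contribution at Enceladus, per the formula above
    printf "%.3f + %.3f = %.2f W/m2 -> %.1f K\n", sun, sat, sun+sat, ((sun+sat)/sb)^0.25
}'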

So 62.4 K should be the average temperature of Enceladus. Is it?

No it’s not.

The average is about 13 K higher. This just goes to show that the standard climate science approach of using greybody calculations is wrong. It’s wrong everywhere except where temperatures accidentally correspond.

There is also definitely no explanation for Enceladus’ south pole aside from internal heat.

And if tiny planetary bodies have plenty of leaking internal heat, may not the Earth?

Based on data from previous flybys, which did not show the south pole well, team members expected that the south pole would be very cold, as shown in the left panel. Enceladus is one of the coldest places in the Saturn system because its extremely bright surface reflects 80 percent of the sunlight that hits it, so only 20 percent is available to heat the surface. As on Earth, the poles should be even colder than the equator because the sun shines at such an oblique angle there…

Equatorial temperatures are much as expected, topping out at about 80 degrees Kelvin (-315 degrees Fahrenheit), but the south pole is occupied by a well-defined warm region reaching 85 Kelvin (-305 degrees Fahrenheit). That is 15 degrees Kelvin (27 degrees Fahrenheit) warmer than expected. The composite infrared spectrometer data further suggest that small areas of the pole are at even higher temperatures, well over 110 degrees Kelvin (-261 degrees Fahrenheit). Evaporation of this relatively warm ice probably generates the cloud of water vapor detected above Enceladus’ south pole by several other Cassini instruments.

The south polar temperatures are very difficult to explain if sunlight is the only energy source heating the surface, though exotic sunlight-trapping mechanisms have not yet been completely ruled out. It therefore seems likely that portions of the polar region are warmed by heat escaping from the interior of the moon. This would make Enceladus only the third solid body in the solar system, after Earth and Jupiter’s volcanic moon Io, where hot spots powered by internal heat have been detected.

NASA

Don’t expect NASA to tell you how much Earth’s internal hotspots contribute to recent warming.

-Z

Trend in Global Fires

Climate alarmists claim that an increase in man-made greenhouse gas emissions will cause more fires. For example …

Human-induced climate change promotes the conditions on which wildfires depend, increasing their likelihood …

ScienceDaily

Funk … says there is very well documented scientific evidence that climate change has been increasing the length of the fire season, the size of the area burned each year and the number of wildfires.

DW

The clearest connection between global warming and worsening wildfires occurs through increasing evapotranspiration and the vapor-pressure deficit.  In simple terms, vegetation and soil dry out, creating more fuel for fires to expand further and faster.

… Global warming will keep worsening wildfires …

SkepticalScience

Sounds serious. Is it true?

We show that fire weather seasons have lengthened across 29.6 million km2 (25.3%) of the Earth’s vegetated surface, resulting in an 18.7% increase in global mean fire weather season length. We also show a doubling (108.1% increase) of global burnable area affected by long fire weather seasons and an increased global frequency of long fire weather seasons across 62.4 million km2 (53.4%) during the second half of the study period.

— Nature: Climate-induced variations in global wildfire danger from 1979 to 2013

This is just about the most scientific paper I could find on the issue. Why are they obsessed with the length of the fire season? Why can’t they just answer the simple question: Is there more or less fire?

NASA has collected daily data on Active Fires since 2000.

Active Fires, March 2000 [Source]

I downloaded and analyzed all of their Active Fires data. Here’s the result:

Now it all makes sense. Climate scammers need to cherrypick locations and seasons in order to distract from the empirical truth that global fires have been decreasing. Disgusting.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/16
# File: fire.sh
# Run: source fire.sh; require; sets; download; index; plot

require() { sudo apt-get install -y gmt gnuplot; }

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD14A1_E_FIRE&year=$y"
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1 }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ S+=$1*$2 
    } END { printf "%0.4f\n", S }'
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    awk '$2>0.02 {"date +%j -d "$1 | getline t; 
        print substr($1,1,4)+t/365" "$2 }' .csv | gmt gmtregress | tee .trend | sed 1d | tr '\t' ' ' | cut -c-25 > plot.csv
    echo "
        set term png size 740,420
        set key outside top center horizontal
        set ytics format '%4.2f'
        set ytics 0.01; set mytics 5
        set xtics 2; set mxtics 2
        set xrange [2000:2021]
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Active Fires Index' w lines lw 2 lc rgb '#DD0000',\
                     '' u 1:3 t 'Linear Trend'  w lines lw 3 lc rgb '#440000'		
    "| gnuplot > fire.png 
}

Fortunate Global Greening

Update: See new information.

NASA offers a data product called a Vegetation Index. This can be used to track how green the Earth is.

February 2000, [Source]

Although many are familiar with recent global greening, I prefer to always check the source data. And so I downloaded all of their available 16-day-increment data from 2000 to 2021. Here’s my result:

0.0936 --> 0.1029 is +9.94%

10% global greening in 20 years! We are incredibly fortunate!

I just wish everyone felt that way. But you know not everyone does. To the extent that humans enhance global greening is precisely what social parasites want to tax and regulate. No good deed goes unpunished.

Anyway, Enjoy 🙂 -Zoe

P.S. The Earth is ~29% land. A Veg Index of ~0.29 would mean all land is covered by heavy vegetation.
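To put the final figure in perspective against that ~0.29 ceiling (simple arithmetic, nothing more):

awk 'BEGIN { printf "%.0f%% of the land-only ceiling\n", 0.1029/0.29*100 }'   # ~35%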

Update: See new information.

# Zoe Phin, 2021/02/16
# File: veg.sh
# Run: source veg.sh; sets; download; index; plot

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_NDVI_16&year=$y" 
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
    rm -f 201[789]-12-31.csv
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1
        }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ 
        S+=$1*$2 
    } END { 
        printf "%0.4f\n", S
    }'
}

yoy() {
    cat .csv | cut -c12- | tr '\n' ' ' | awk -vp=$1 '{
        for (i=0;i<p/2;i++) print ""
        for (i=p/2;i<=NF-p/2;i++) { s=0
            for (j=i-p/2; j<=i+p/2; j++)
                s+=$j/(p+1)
            printf "%.4f\n", s
        }
    }' > .yoy
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    yoy 22; paste -d ' ' .csv .yoy > plot.csv
    sed -n '12p;$p' .yoy | tr '\n' ' ' | awk '{printf "%s --> %s, is %+0.2f%\n", $1, $2, ($2/$1-1)*100 }'
    echo "
        set term png size 740,620
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01
        set mxtics 2
        set mytics 5
        set xrange ['2000-01-01':'2020-12-31']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Vegetation Index ' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t '1-Year Moving Avg' w lines lw 3 lc rgb '#005500'		
    "| gnuplot > veg.png 
}

Average Moon Day and Night Temperatures

NASA’s Moon Fact Sheet doesn’t give the diurnal temperature range for the entire moon, just the equator:

Diurnal temperature range (equator): 95 K to 390 K

Strange. They have collected the data. Why didn’t they do the calculations? So I could do it?

I went through the data for every 15-degree longitude increment available here.

Day is the center hot spot +/- 90 degrees; night is everything outside of that. For example, in the 015E snapshot the hot spot is centered near longitude +15, so day spans -75 to +105 degrees (these are the longitude windows coded in calc() below).

Here’s my result:

Lon    Day    Night
000: 303.914 099.629 
015: 304.115 099.809 
030: 304.250 099.569 
045: 304.342 099.402 
060: 303.527 099.818 
075: 303.196 099.688 
090: 302.704 099.543 
105: 302.347 099.650 
120: 301.705 099.676 
135: 301.474 099.267 
150: 301.550 099.314 
165: 300.939 099.281 
180: 300.458 099.378 
195: 301.062 099.347 
210: 301.293 099.516 
225: 302.147 099.307 
240: 303.114 099.249 
255: 302.813 099.433 
270: 302.921 099.221 
285: 303.267 099.054 
300: 303.318 099.161 
315: 303.682 099.245 
330: 303.588 099.397 
345: 304.116 099.122 

Avg: 302.743 099.420

Whole Moon:  201.082

As you can see, the time-averaged whole moon goes from a nightly low of 99.420 K to a daily high of 302.743 K, with a 24-moon-hour average of 201.082 K.

I assume that day and night are each a 12-moon-hour period. This may not philosophically be so, but my whole purpose was to figure out the difference between the light and dark equal-area hemispheres, not to compare unequal light and dark areas.
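The Whole Moon figure above is just the midpoint of the two hemispheric averages. A quick check:

awk 'BEGIN { printf "(302.743 + 99.420) / 2 = %.2f K\n", (302.743+99.420)/2 }'   # 201.08 K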

I’ll contact NASA’s Fact Sheet webadmin to ask him to update.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/15
# File: moont.sh
# Run: source moont.sh; download; calc

download() {
    for l in {000..345..15}; do                   
        echo http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.xyz        
    done | wget -ci -
}

avg() {
    awk '{ f=1/581.9; e=2*f-f^2; r=atan2(0,-1)/180
        T[$2]+=$3; N[$2]+=1; A[$2]+=r/1438.355*(1-e)*cos(r*$2)/(1-e*sin(r*$2)^2)^2
    } END {for (L in T) printf "%+06.2f %7.3f %.15f\n", L, T[L]/N[L], A[L]}' | awk '{
        T+=$2*$3 } END { printf "%07.3f ", T }'
}

calc() {
    for l in {000..345..15}; do                   
        echo -n "$l: "
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && ($1>-90 && $1<90 ))			{ print }
        (l=="015" && ($1>-75 && $1<105))			{ print }
        (l=="030" && ($1>-60 && $1<120))			{ print }
        (l=="045" && ($1>-45 && $1<135))			{ print }
        (l=="060" && ($1>-30 && $1<150))			{ print }
        (l=="075" && ($1>-15 && $1<165))			{ print }
        (l=="090" && ($1>0   && $1<180))  			{ print }
        (l=="105" && ($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && ($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && ($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && ($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && ($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && ($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && ($1>105 || $1<-75))			{ print }
        (l=="210" && ($1>120 || $1<-60))			{ print }
        (l=="225" && ($1>135 || $1<-45))			{ print }
        (l=="240" && ($1>150 || $1<-30))			{ print }
        (l=="255" && ($1>165 || $1<-15))			{ print }
        (l=="270" && ($1<0 ))						{ print }
        (l=="285" && ($1<15 && $1>-165))			{ print }
        (l=="300" && ($1<30 && $1>-150))			{ print }
        (l=="315" && ($1<45 && $1>-135))			{ print }
        (l=="330" && ($1<60 && $1>-120))			{ print }
        (l=="345" && ($1<75 && $1>-105))			{ print }
        ' | avg
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && !($1>-90 && $1<90 ))			{ print }
        (l=="015" && !($1>-75 && $1<105))			{ print }
        (l=="030" && !($1>-60 && $1<120))			{ print }
        (l=="045" && !($1>-45 && $1<135))			{ print }
        (l=="060" && !($1>-30 && $1<150))			{ print }
        (l=="075" && !($1>-15 && $1<165))			{ print }
        (l=="090" && !($1>0   && $1<180))  			{ print }
        (l=="105" && !($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && !($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && !($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && !($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && !($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && !($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && !($1>105 || $1<-75))			{ print }
        (l=="210" && !($1>120 || $1<-60))			{ print }
        (l=="225" && !($1>135 || $1<-45))			{ print }
        (l=="240" && !($1>150 || $1<-30))			{ print }
        (l=="255" && !($1>165 || $1<-15))			{ print }
        (l=="270" && !($1<0 ))						{ print }
        (l=="285" && !($1<15 && $1>-165))			{ print }
        (l=="300" && !($1<30 && $1>-150))			{ print }
        (l=="315" && !($1<45 && $1>-135))			{ print }
        (l=="330" && !($1<60 && $1>-120))			{ print }
        (l=="345" && !($1<75 && $1>-105))			{ print }
        ' | avg
        echo
    done | awk '
        BEGIN { print "Lon    Day    Night" }
              { D+=$2; N+=$3; print }
        END   { printf "\nAvg: %07.3f %07.3f\n\nWhole Moon:  %07.3f", D/24, N/24, D/48+N/48}'
}

### Blog Extra ###

require() { sudo apt-get install -y imagemagick; }

dlimgs() {
    for l in {000..345..15}; do                   
        wget -O L$l.png http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.png        
    done
}

scale() { n=0
    for l in {000..345..15}; do                   
        convert -quality 30% -scale 12% L$l.png SL$(printf "%03d" $n).jpg
        let n++
    done
}

animate() { convert -delay 30 -loop 0 SL*.jpg animoon.gif; }

Big Blue Marble

I don’t know about you, but I always thought this was a beautiful image:

Blue Marble ; Source: NASA, 2012, Showing 2004.

I decided to fix it per this article, and make a large (1920×960) animated version. The result is here. It’s 7MB, so please wait for it to load. Right-click and save image in case wordpress is annoying. I made it my wallpaper, and so can you!

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/14
# File: terra.sh
# Run: source terra.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=BlueMarbleNG' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O terra$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=1920&height=960"
        let n++
    done
}

fix() { 
    for jpg in `ls -1 terra*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n1920 960\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 1,3d | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w23040 > .tmp

        cut -c1-756 .tmp > .right
        cut -c756-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 50 -loop 0 terra*.jpg animterra.gif; }

Annual Leaf Cycle

Our Beautiful Living and Breathing Planet

The map is fixed per this article.

# Zoe Phin, 2021/02/13
# File: lai.sh
# To Run: source lai.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD15A2_M_LAI&date=2016-01-01' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O lai$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=750&height=375"
        sleep 1
        let n++
    done
}

fix() { 
    for jpg in `ls -1 lai*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n750 375\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w9000 > .tmp

        cut -c1-300 .tmp > .right
        cut -c300-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 25 -loop 0 lai*.jpg animlai.gif; }

Happy Valentines Day!

❤ -Zoe

Effect of Clouds on Global Upwelling Radiation

I downloaded and analyzed 10 gigabytes of data fully covering the years 2003 to 2019 from “the only project worldwide whose prime objective is to produce global climate data records of ERB [Earth’s Radiation Budget] from instruments designed to observe the ERB” [site] [data]. The goal was to see the effect of clouds at the surface, especially on Upwelling Longwave Radiation (LW_UP).

NASA Reminds us …

High clouds are much colder than low clouds and the surface. They radiate less energy to space than low clouds do. The high clouds in this image are radiating significantly less thermal energy than anything else in the image. Because high clouds absorb energy so efficiently, they have the potential to raise global temperatures. In a world with high clouds, much of the energy that would otherwise escape to space is captured in the atmosphere. High clouds make the world a warmer place. If more high clouds were to form, more heat energy radiating from the surface and lower atmosphere toward space would be trapped in the atmosphere, and Earth’s average surface temperature would climb.

NASA

In contrast to the warming effect of the higher clouds, low stratocumulus clouds act to cool the Earth system. Because lower clouds are much thicker than high cirrus clouds, they are not as transparent: they do not let as much solar energy reach the Earth’s surface. Instead, they reflect much of the solar energy back to space (their cloud albedo forcing is large). Although stratocumulus clouds also emit longwave radiation out to space and toward the Earth’s surface, they are near the surface and at almost the same temperature as the surface. Thus, they radiate at nearly the same intensity as the surface and do not greatly affect the infrared radiation emitted to space (their cloud greenhouse forcing on a planetary scale is small). On the other hand, the longwave radiation emitted downward from the base of a stratocumulus cloud does tend to warm the surface and the thin layer of air in between, but the preponderant cloud albedo forcing shields the surface from enough solar radiation that the net effect of these clouds is to cool the surface.

NASA

Here’s the global percent of clouds by type:

Clouds  Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
Type_1  008.379  007.999  007.839  008.140  008.443  008.367  008.345  008.524  008.550  008.229  008.157  007.984  007.999  008.028  008.256  008.465  009.469  009.641
Type_2  024.677  023.556  023.799  024.149  024.438  024.168  024.382  024.580  024.419  024.181  024.766  024.539  024.534  024.796  025.193  025.493  026.317  026.195
Type_3  036.259  035.815  035.721  035.894  036.028  035.646  036.004  036.248  035.742  035.566  036.363  036.144  036.194  036.563  036.856  036.918  037.531  037.173
Type_4  066.637  067.458  067.597  067.381  066.701  066.395  066.248  066.500  066.579  066.149  066.087  066.093  066.003  066.103  066.577  066.569  067.425  066.972
Type_5  133.275  134.917  135.194  134.763  133.403  132.790  132.496  133.001  133.157  132.298  132.173  132.186  132.007  132.206  133.154  133.139  134.851  133.944

Cloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)

The project keeps track of 4 different types of observed LW_UP: All, Clr, AllNoAero, and Pristine. All is normal observed sky. Clr (clear) is no clouds. AllNoAero is All minus aerosols. Pristine is Clr minus aerosols.

Since clouds play an important role in Earth’s supposed greenhouse effect, and this effect leads to a supposed serious warming at the surface, we should see a very large difference between all these 4 scenarios.

The results (Units = W/m²):

Series               Average     2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_up        397.445  397.191  396.820  397.667  397.222  397.033  396.243  396.924  397.166  396.364  396.883  397.063  397.361  398.266  398.894  398.455  398.166  398.848
all_sfc_lw_up        398.167  397.921  397.559  398.404  397.945  397.750  396.955  397.632  397.876  397.076  397.598  397.795  398.090  398.992  399.625  399.189  398.874  399.551
pristine_sfc_lw_up   397.387  397.135  396.763  397.610  397.165  396.974  396.182  396.866  397.107  396.306  396.825  397.006  397.305  398.207  398.836  398.397  398.106  398.790
allnoaero_sfc_lw_up  398.129  397.885  397.522  398.368  397.907  397.711  396.914  397.594  397.838  397.038  397.560  397.758  398.054  398.953  399.587  399.152  398.834  399.513

But in fact there is very little difference. The difference in surface LW_UP between a Pristine sky (no clouds, no aerosols) and All sky (see above cloud data) is just 0.78 W/m².
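That 0.78 W/m² is just the difference of two Average entries in the table above:

awk 'BEGIN { printf "all - pristine (sfc_lw_up): %.2f W/m2\n", 398.167 - 397.387 }'   # 0.78 W/m2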

I would even argue it might be ZERO. It’s only not zero because a satellite can’t measure both scenarios in the same place at the same time. It can only measure someplace nearby or the same place at another time. Even if I’m wrong about this, the value is still very unimpressive.

Now let’s look at downwelling longwave radiation (LW_DN) and longwave radiation at the top of the atmosphere (TOA_LW):

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_dn        317.924  317.702  317.175  318.077  317.760  317.364  316.483  317.572  318.370  316.923  317.328  317.615  318.045  319.242  319.663  318.692  318.146  318.559
all_sfc_lw_dn        347.329  347.436  347.344  348.132  347.250  346.673  345.582  346.526  347.440  346.029  346.573  347.385  347.673  348.678  349.256  348.454  346.994  347.173
pristine_sfc_lw_dn   316.207  316.004  315.473  316.394  316.063  315.611  314.691  315.852  316.654  315.192  315.589  315.934  316.384  317.490  317.954  316.968  316.400  316.867
allnoaero_sfc_lw_dn  346.359  346.490  346.395  347.196  346.297  345.669  344.546  345.549  346.466  345.048  345.590  346.448  346.754  347.694  348.296  347.489  345.987  346.195

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_toa_lw_up        262.503  262.373  262.267  262.645  262.446  262.584  262.087  262.268  262.521  262.179  262.185  262.499  262.543  262.668  263.075  262.942  262.535  262.735
all_toa_lw_up        237.889  237.990  237.924  238.257  237.970  238.339  237.685  237.764  238.165  237.975  237.581  237.895  237.973  238.027  237.999  237.848  237.167  237.557
pristine_toa_lw_up   262.979  262.833  262.720  263.102  262.911  263.070  262.598  262.743  262.988  262.665  262.684  262.965  263.009  263.165  263.547  263.419  263.033  263.198
allnoaero_toa_lw_up  238.168  238.260  238.189  238.523  238.242  238.626  237.987  238.042  238.438  238.260  237.874  238.167  238.245  238.320  238.274  238.126  237.456  237.827

Let’s now compare the averages side by side for all 3:

Series               Average

clr_toa_lw_up        262.503
all_toa_lw_up        237.889
pristine_toa_lw_up   262.979
allnoaero_toa_lw_up  238.168

clr_sfc_lw_dn        317.924
all_sfc_lw_dn        347.329
pristine_sfc_lw_dn   316.207
allnoaero_sfc_lw_dn  346.359

clr_sfc_lw_up        397.445
all_sfc_lw_up        398.167
pristine_sfc_lw_up   397.387
allnoaero_sfc_lw_up  398.129

The standard greenhouse effect narrative is that infrared-absorbing gases prevent radiation from reaching space and this causes warming at the surface (thus more radiation). Well, we clearly see that’s not the case. If clouds (water vapor + aerosols) hardly change outgoing surface radiation, then the whole hypothesis is in error. Less top-of-atmosphere outgoing radiation doesn’t cause surface heating (and thus more radiation from the surface), despite the increase in downwelling radiation.
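To put numbers on that last sentence, compare the all-sky minus clear-sky differences for downwelling versus upwelling surface LW, straight from the Average columns above:

awk 'BEGIN {
    printf "sfc_lw_dn (all - clr): %+.3f W/m2\n", 347.329 - 317.924
    printf "sfc_lw_up (all - clr): %+.3f W/m2\n", 398.167 - 397.445
}'
# clouds add ~29.4 W/m2 of downwelling LW, yet only ~0.7 W/m2 more upwelling LW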

Enjoy 🙂 -Zoe

Update 02/28

Resident Biden’s Senior Climate Advisor reminds us

We quantify the impact of each individual absorber in the total effect by examining the net amount of long‐wave radiation absorbed in the atmosphere (G, global annual mean surface upwelling LW minus the TOA LW upwelling flux) [Raval and Ramanathan, 1989; Stephens and Greenwald, 1991]. This is zero in the absence of any long‐wave absorbers, and around 155 W/m2 in the present‐day atmosphere [Kiehl and Trenberth, 1997]. This reduction in outgoing LW flux drives the 33°C greenhouse effect defined above, and is an easier diagnostic to work with.

Gavin Schmidt et al.

that the greenhouse effect (G) is just SFC_LW_UP minus TOA_LW_UP. So let’s do that for all scenarios:

clr        397.445 - 262.503 = 134.942
all        398.167 - 237.889 = 160.278
pristine   397.387 - 262.979 = 134.408
allnoaero  398.129 - 238.168 = 159.961
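Side by side, the spread in G versus the spread in surface emission:

awk 'BEGIN {
    printf "G         (all - clr): %+.3f W/m2\n", 160.278 - 134.942
    printf "sfc_lw_up (all - clr): %+.3f W/m2\n", 398.167 - 397.445
}'
# ~25 W/m2 more "greenhouse effect", yet only ~0.7 W/m2 more surface emission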

So there is definitely a mathematical “greenhouse effect” difference between the 4 scenarios, and yet this makes no difference to surface upwelling radiation, and by extension to surface temperature.

Varying the amount of “greenhouse effect” means nothing to surface temperature.

Since the absorption of radiation by IR active gases makes no difference to surface temperature, the greenhouse effect hypothesis is simply incorrect and should be abandoned for the sake of empirical science.

-Zoe

Code clouds.sh:

# Zoe Phin, 2021/02/09

require() { sudo apt install -y hdf4-tools; }

download() { 
    mkdir -p ceres; n=4
    for y in {2003..2019}; do 
        for m in {01..12}; do
            [ $y$m -ge 201507 ] && n=5
            [ $y$m -ge 201603 ] && n=6
            [ $y$m -ge 201802 ] && n=7
            wget -O ceres/$y$m.hdf -c "https://opendap.larc.nasa.gov/opendap/hyrax/CERES/SYN1deg-Month/Terra-Aqua-MODIS_Edition4A/$y/$m/CER_SYN1deg-Month_Terra-Aqua-MODIS_Edition4A_40${n}406.$y$m.hdf"
        done
    done
}

cmd() { ncdump-hdf -l999 ceres/$1$2.hdf -v "$3"; }

lwup() { series='init_clr_sfc_lw_up init_all_sfc_lw_up init_pristine_sfc_lw_up init_allnoaero_sfc_lw_up'; lw; }
lwdn() { series='init_clr_sfc_lw_dn init_all_sfc_lw_dn init_pristine_sfc_lw_dn init_allnoaero_sfc_lw_dn'; lw; }
lwta() { series='init_clr_toa_lw_up init_all_toa_lw_up init_pristine_toa_lw_up init_allnoaero_toa_lw_up'; lw; }

lw() {
    printf "\n%-20s %-11s" Series Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    for s in $(echo $series); do
        printf "%-20s = " $s
        for y in {2003..2019}; do
            for m in {01..12}; do
                cmd $y $m ${s}_glob | sed -n 3173,+2p
            done | awk -vv="${s}_glob" '$1==v{s+=$3}END{printf "%07.3f ",s/12}'
        done
        echo
    done | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
    }' | sed -r 's/init_|adj_//' | column -t
}

clouds() {
    rm -f .m* .y* .cld
    printf "\n%-7s %-11s" Clouds Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    printf "Type_%d =\n" $(seq 5) > .cld
    for y in {2003..2019}; do 
        for m in {01..12}; do 
            cmd $y $m obs_cld_amount_glob | sed -n 3173,+2p | grep -o '[0-9].*[0-9]' | tr ',' '\n' > .m$m
        done 
        paste .m* | awk '{ s=0; for(i=1;i<=NF;i++) s+=$i; printf "%07.3f\n", s/12 }' > .y$y
    done
    ( 	paste -d ' ' .cld .y* | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
        }' | column -t
        echo -e '\nCloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)'
    )
}

Run:

$ source clouds.sh; require && download
$ clouds; lwup; lwdn; lwta

Greenhouse Gases as Coolants

There, I said it. Don’t believe me? I will show you …

NASA offers an online tool for measuring the effects of clouds, aerosols, and greenhouse gases.

Set Output to OUTPUT_details. Note the CO2 (ppmv) setting in the bottom left. Click the Compute button to apply the form changes. The output appears below the form, so scroll down. Result:

Purple Ellipse = LWUP @ one meter above surface

I wrote a program to see changes to Upwelling Longwave Radiation (LWUP) at 1 meter above surface under different CO2 ppmv settings and zones. Here is the result:

PPM  Trop   MLS    MLW    SAS    SAW
 15 456.36 421.41 309.39 382.31 246.71 
 30 456.35 421.41 309.41 382.31 246.75 
 60 456.34 421.41 309.43 382.31 246.80 
120 456.33 421.40 309.46 382.31 246.87 
180 456.32 421.40 309.47 382.31 246.91 
240 456.32 421.40 309.49 382.31 246.95 
300 456.31 421.40 309.50 382.31 246.97 
360 456.31 421.40 309.50 382.31 246.99 
420 456.30 421.40 309.51 382.31 247.01 
480 456.30 421.39 309.51 382.31 247.02 
540 456.29 421.39 309.52 382.30 247.03 
600 456.29 421.39 309.52 382.30 247.04 

Units are in W/m²

(Trop=Tropics, MLS=Mid-Latitude Summer, MLW=Mid-Latitude Winter, SAS=Subarctic Summer, SAW=Subarctic Winter)
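Spelling out what the table shows, here is the change in LWUP going from 15 ppm to 600 ppm of CO2 in each zone (last row minus first row):

awk 'BEGIN {
    printf "Trop %+.2f  MLS %+.2f  MLW %+.2f  SAS %+.2f  SAW %+.2f  W/m2\n",
        456.29-456.36, 421.39-421.41, 309.52-309.39, 382.30-382.31, 247.04-246.71
}'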

You see it ? ? ?

NASA’s tool also allows you to edit the atmospheric composition of water vapor, by setting Atmosphere EDIT to Detail.

I automated changes to sea-level water vapor content while maintaining the same CO2 level (410 ppm) and the same zone (Mid-Latitude Summer). Result:

0.001 423.39 
0.002 423.39 
0.004 423.39 
0.010 423.39 
0.020 422.07 
0.040 421.78 
0.100 421.31 
0.200 421.13 
0.400 421.24

Anyway, that’s all the time I have for now. -Zoe

Update 02/08

While my analysis for CO2 is correct, it appears my H2O analysis was too simplistic. I have rewritten the code. What I do now is use all 5 climate zones and change the water vapor content in the whole atmospheric column, not just near the surface. I divide the original content by 2, 4, 8, and 16, and then multiply it by the same factors.

New Result:

  WV-X   Trop   MLS    MLW    SAS    SAW
0.0625X 455.63 420.61 308.94 381.72 246.23 
 0.125X 455.73 420.76 309.09 381.86 246.42 
  0.25X 455.83 420.91 309.24 381.98 246.61 
   0.5X 456.01 421.09 309.37 382.11 246.80 
     1X 456.30 421.40 309.51 382.31 247.01 
     2X 456.40 421.70 309.70 382.59 247.22 
     4X 456.02 421.64 309.95 382.65 247.48 
     8X 455.53 421.31 310.08 382.33 247.82 
    16X 455.41 421.12 309.91 381.96 248.11  

There is now warming in every zone but the tropics. No problem … The extra energy needed to raise the water vapor content is exactly what these calculations reflect. What you’re seeing are the new, raised fluxes needed to raise the WV content.

Apologies if you feel the title of this article is now misleading. I strive for truth and accuracy.

Another Update 02/08 🙂

I updated the code to check every spectral type, not just the Ocean. The function h2o_diff tracks the change in LWUP from multiplying the water vapor content by 256 (going from 0.0625X to 16X). Here is the result:

   Type  Trop   MLS    MLW    SAS    SAW
    01  -1.31  -0.59  -0.22  -0.93  +0.82
    02  -1.31  -0.59  -0.22  -0.93  +0.82
    03  +0.28  +0.85  +0.80  +0.39  +1.52
    04  +0.28  +0.85  +0.80  +0.39  +1.52
    05  -0.54  +0.10  +0.27  -0.29  +1.15
    06  +2.59  +2.98  +2.45  +2.42  +2.75
    07  +8.74  +8.61  +6.77  +7.76  +5.92
    08  -0.45  +0.17  +0.30  -0.24  +1.15
    09  -0.45  +0.17  +0.30  -0.24  +1.15
    10  -0.45  +0.17  +0.30  -0.24  +1.15
    11  -0.32  +0.36  +0.64  +0.01  +1.52
    12  -0.45  +0.17  +0.30  -0.24  +1.15
    13  -2.47  -1.63  -0.94  -1.90  +0.32
    14  -0.47  +0.16  +0.30  -0.24  +1.17
    15  -2.44  -1.60  -0.91  -1.86  +0.35
    16 +11.84 +11.45  +8.95 +10.45  +7.52
    17  -0.22  +0.51  +0.97  +0.24  +1.88

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

I did the same for CO2 (co2_diff):

   Type  Trop   MLS    MLW    SAS    SAW
    01  -0.09  -0.06  -0.06  -0.09  +0.10
    02  -0.09  -0.06  -0.06  -0.09  +0.10
    03  -0.07  -0.03  +0.00  -0.05  +0.15
    04  -0.07  -0.03  +0.00  -0.05  +0.15
    05  -0.07  -0.04  -0.03  -0.07  +0.13
    06  -0.04  +0.01  +0.11  +0.01  +0.30
    07  +0.02  +0.11  +0.41  +0.17  +0.63
    08  -0.08  -0.04  -0.05  -0.07  +0.11
    09  -0.08  -0.04  -0.05  -0.07  +0.11
    10  -0.08  -0.04  -0.05  -0.07  +0.11
    11  -0.07  -0.03  +0.05  -0.04  +0.23
    12  -0.08  -0.04  -0.05  -0.07  +0.11
    13  -0.10  -0.07  -0.10  -0.12  +0.06
    14  -0.07  -0.04  -0.03  -0.07  +0.12
    15  -0.10  -0.08  -0.09  -0.11  +0.07
    16  +0.06  +0.16  +0.55  +0.25  +0.80
    17  -0.07  -0.02  +0.13  -0.01  +0.33

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

Code rtransfer.sh:

# Zoe Phin, v2.2: 2021/02/08

url='https://cloudsgate2.larc.nasa.gov/cgi-bin/fuliou/runfl.cgi?CASE=A
&Compute=Compute&ID=014605%0D%0A&DOUT=F&FOUT=1
&SELOUT=OUTPUT_details
&ATM=mls.atm&EATM=No
&CZA=0.5&VZA=1.0
&STREAM=GWTSA&SFCALB=IGBP
&SFCTYPE=17
&FOAM=OFF&WIND=5.0
&CF3=0.0&CHL=0.1
&CF1=1.0&COD1=1.0&CLDTOP1=250&CLDBOT1=300&PHASE1=ICE&CLDPART1=60&CINH1=100
&CF2=0.0&COD2=10.0&CLDTOP2=850&CLDBOT2=900&PHASE2=WATER&CLDPART2=20&CINH2=100
&AOT1=0.20&AOTTYPE1=continental&AOTSH1=4
&AOT2=0.00&AOTTYPE2=0.5_dust_l2004&AOTSH2=1
&CONT=2.1_ckd&ELEV=0.0
&RES=HI
&CO2=X'

types() { echo '
     1 Evergreen Needle Forest   11 Wetlands
     2 Evergreen Broad Forest    12 Crops
     3 Deciduous Needle Forest   13 Urban
     4 Deciduous Broad Forest    14 Crop/Mosaic
     5 Mixed Forest              15 Permanent Snow
     6 Closed Shrub              16 Barren / Desert
     7 Open Shrub                17 Ocean
     8 Woody Savanna             18 Tundra
     9 Savanna                   19 Fresh Snow
    10 Grassland                 20 Sea Ice
    ' | tr -d '\t'
}

co2() {
    echo "PPM  Trop   MLS    MLW    SAS    SAW"
    for ppm in 15 30 60 120 180 240 300 360 420 480 540 600; do
        printf "%3d " $ppm
        for zone in trop mls mlw sas saw; do
            echo $url | sed "s/ //g; s/CO2=X/CO2=$ppm/; s/ATM=mls/ATM=$zone/" | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done 
}

co2_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        T=$(printf "%02d" $t)
        sed -n "/Type $T/,/^$/p" co2.csv | sed -n '3,14p' | cut -c4- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

h2o() {
	for atm in trop mls mlw sas saw; do
		echo $url | sed "s/ //g; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$atm/" | wget -qO- -i- | sed -n '/<textarea /,/\/textarea>/p;' | sed '1d;$d' > $atm.prof
	done

    echo "  WV-X   Trop   MLS    MLW    SAS    SAW"
    for w in 0.0625 0.125 0.25 0.5 1 2 4 8 16; do
        printf "%6gX " $w
        for zone in trop mls mlw sas saw; do
            atmo=$(awk -vw=$w '{printf "%-7G %8.4f %13G %13G%0D%0A\n", $1, $2, $3*w, $4}' $zone.prof | tr ' ' '+')

            (echo $url | sed "s/ //g; s/CO2=X/CO2=410/; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$zone/"; 
            echo "&ATMOSPHERE=$atmo") | tr -d '\n' | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done | tee h2o.csv
}

h2o_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        sed -n "/Type $t/,/^$/p" h2o.csv | sed -n '3,11p' | cut -c9- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

Run it:

$ . rtransfer.sh; co2
$ . rtransfer.sh; h2o


$ . rtransfer.sh; co2_diff  # (must be run after co2)
$ . rtransfer.sh; h2o_diff  # (must be run after h2o)

Automated Twit

I needed to create a throwaway Twitter account for a research project. I decided to automate its creation and share the code with you 🙂

I used the handy service of 10minutemail for receiving Twitter’s verification code.

I used the wonderful browser automation tool Nightmare.

Automation in action:

Patiently wait 10 seconds at the beginning, and again to receive the verification code

Code twit.sh:

# Zoe Phin, 2021/01/28

require() { sudo apt-get -y install npm && npm install nightmare; }

newtwit() { echo "nm = require('nightmare')
    main().catch(e=>{console.log('done.')})
    async function main() {
        e = nm({show: false})
        await e.goto('https://10minutemail.com').wait(2000)

        email = await e.evaluate( ()=> {return document.querySelector('input').value} )
        console.log(email)

        n = nm({show: true}).viewport(740,680)
        await n.goto('https://twitter.com/i/flow/signup').wait(6000)

        await n.insert('input[name=name]','Unique Snowflake')

        await n.evaluate( ()=> { 
            document.querySelectorAll('div[role=button]')[1].children[0].click() } )

        await n.insert('input[name=email]', email)

        await n.select('#Month','1').select('#Day','29').select('#Year','1999')

        await n.type('body','\t\t\t\t\t\t ')
        await n.type('body','\t\t\t \t\t\t ')
        await n.type('body','\t\t\t\t\t\t\t\t\t ')
 
        vcode = await e.wait(10000).evaluate( ()=> {
            return document.querySelector('div.small_subject').innerText.substr(0,6) })

        await n.insert('input', vcode).type('body','\t\t\t ')
        console.log(vcode)
        
        await n.wait(2000).type('input[name=password]', 'Un1qu3 Sn0wfl4k3!')
        await n.wait(1000).type('body','\t\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(2000).type('body','\t\t ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body','\t ')
    //	await n.wait(5000).end()
    } 
    " | node 
}

Setup NodeJS and Nightmare:

$ . twit.sh; require

Run:

$ . twit.sh; newtwit

Note: As I’m not an adept browser-bot coder, this code may fail once or twice before working. Just run it until it does. Hopefully someone with more time can fix it. It’s good enough for me.

Enjoy 🙂 -Zoe

P.S. Did you notice Twitter is pushing POTUS as the most important person to follow?

My research featured in a video

My previous article was featured in a video. Thank you Near Death Experiment ! The quality is excellent 🙂

Unfortunately I did not track it in time to watch youtube remove the million likes it got 😉

-Zoe

Update 03/01

Looks like Youtube cancelled my kind fan Near Death Experiment and removed the video with ~10K views.

It’s still available here:

White House Youtube Dislike Manipulation

I’ve seen screenshots of YouTube modifying dislikes of White House videos. I decided I would do a thorough analysis myself. I wrote a script to check video stats every 80 seconds for 24 hours – for all videos on White House’s YouTube channel.

The collected data is archived here and here. The format is space-separated “CSV”, as follows:

VideoURL UnixTimestamp Date,Time Views Likes Dislikes

Here is a sample of the most egregious manipulation:

Some videos were delisted in minutes!:

https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771197 01/27/2021,13:13:17      1227       437      2963
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771285 01/27/2021,13:14:45      1463       441      2999
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771372 01/27/2021,13:16:12      1763       449      3030
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771459 01/27/2021,13:17:39      2476       455      3060
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771546 01/27/2021,13:19:06      2640       459      3098
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771720 01/27/2021,13:22:00      3588       470      3183
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699362 01/26/2021,17:16:02       918       405      4942
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699448 01/26/2021,17:17:28      1202       412      4976
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699534 01/26/2021,17:18:54      1375       415      5026
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766646 01/27/2021,11:57:26       255       375      1771
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766733 01/27/2021,11:58:53       455       380      1823
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766819 01/27/2021,12:00:19       455       383      1852
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766906 01/27/2021,12:01:46       819       387      1886
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766992 01/27/2021,12:03:12      1148       393      1932
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767079 01/27/2021,12:04:39      1462       397      1971
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767166 01/27/2021,12:06:06      1830       398      2019
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770591 01/27/2021,13:03:11      1587        83      2040
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770764 01/27/2021,13:06:04      3014        95      2114

Likes + dislikes were greater than views in some cases. Although that seems impossible, Youtube updates views more slowly, so the view counts do not reflect real views at the time. For example:

https://www.youtube.com/watch?v=jw1_00uI02U 1611720090 01/26/2021,23:01:30     44404       924      8099
https://www.youtube.com/watch?v=jw1_00uI02U 1611720176 01/26/2021,23:02:56     44404       924      8118
https://www.youtube.com/watch?v=jw1_00uI02U 1611720260 01/26/2021,23:04:20     44404       925      8132
https://www.youtube.com/watch?v=jw1_00uI02U 1611720345 01/26/2021,23:05:45     44404       925      8151
https://www.youtube.com/watch?v=jw1_00uI02U 1611720429 01/26/2021,23:07:09     44404       925      8168
https://www.youtube.com/watch?v=jw1_00uI02U 1611720514 01/26/2021,23:08:34     44556       925      8184
https://www.youtube.com/watch?v=jw1_00uI02U 1611720599 01/26/2021,23:09:59     44556       925      8199
https://www.youtube.com/watch?v=jw1_00uI02U 1611720683 01/26/2021,23:11:23     44556       928      8219
https://www.youtube.com/watch?v=jw1_00uI02U 1611720768 01/26/2021,23:12:48     44556       928      8237

So it’s possible for likes and dislikes to accumulate while the view count stays the same. Eventually, the view count jumps up to better reflect reality.
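
If you want to list every row where likes plus dislikes exceed the reported views, this is a minimal sketch against the data format above (again assuming the archive is saved as data.csv):

$ awk '$5+$6 > $4' data.csv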

The record of every time dislikes were removed is archived at https://pastebin.com/raw/F4ELDc4R

Grand Total         -130321

130 thousand dislikes were removed in a 24-hour period!

And this is for the most popular US President of all time!

Enjoy 🙂 -Zoe


Update

This research was featured in a YouTube video.


Code wh.sh:

# Zoe Phin, 2021/01/26

require() { sudo apt-get install -y curl gnuplot; }

stats() {
    # scrape the channel page for video IDs, then fetch stats for each one
    list=$(curl -s 'https://www.youtube.com/c/WhiteHouse/videos' | grep -o 'watch?v=[^"]*')
    for i in $list; do
        link="https://www.youtube.com/$i"
        date=$(date +"%s %x,%R:%S" | tr -d '\n')
        # strip thousands separators and split the page into lines at '}'
        curl -s $link | tr -d ',' | tr '}' '\n' > new
        # take the first "N views/likes/dislikes" match of each kind
        grep -m 1 -o '[0-9,]* views' new > .views
        grep -m 1 -o '[0-9,]* likes' new > .likes
        grep -m 1 -o '[0-9,]* dislikes' new  > .dislikes

        paste .views .likes .dislikes | awk -vL=$link -vD="$date" '
            NF==6{printf "%s %s %9s %9s %9s\n", L, D, $1, $3, $5}'
    done
}

collect() {
    while true; do
        stats; sleep 75
    done | tee -a data.csv
}

dislikes() {
    list=$(cut -c1-44 data.csv | sort -u)

    for vid in $list; do	
        echo $vid
        grep ^$vid data.csv | awk '{
            DiffD=$6-D
            if (DiffD < 0) { 
                printf "%s %+7d\n", $3, DiffD 
                DLost+=DiffD
            }
            D=$6
        } END {
            printf "%-19s %7d\n", "Total", DLost
        }' 
        echo
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() {
    list=$(cut -c1-44 data.csv | sort -u)
    let n=0

    for vid in $list; do	
        let n++
        awk -vV=$vid '$1==V {print $2" "$4" "$5" "$6}' data.csv > plot.csv

        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title '$vid'
        set timefmt '%s'
        set xdata time
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Run:

$ source wh.sh; require

Collect data:

$ collect

( press Ctrl-C when you're done )

Record of dislike drops:

$ dislikes

Generate charts:

$ plot

Version 2.0: Cleaner, and grabs stats at a random 60-to-120-second interval.

# Zoe Phin, v2.0 - 2021/02/20

require() { sudo apt-get install -y wget gnuplot; }

collect() {
    url="https://www.youtube.com"
    while true; do
        for vid in $(wget -qO- "$url/c/WhiteHouse/videos" | grep -o 'watch?v=[^"]*'); do
            # keep every other match, then strip letters, commas and newlines
            # so the three counts land as one space-separated record for awk
            wget -qO- $url/$vid | egrep -o '[0-9,]* (views|likes|dislikes)' |\
            sed -n 1~2p | tr -d '[:alpha:],\n' |\
            awk -vL=$url/$vid -vD="$(date +"%s %x,%R:%S" | tr -d '\n')" '
                NF==3 { printf "%s %s %9s %9s %9s\n", L, D, $1, $2, $3 }'
        done
        sleep $(seq 60 120 | shuf | head -1)   # random 60-120 second pause between passes
    done | tee -a data.csv
}

dislikes() {
    for vid in $(cut -c1-44 data.csv | sort -u); do	
        awk -vv=$vid 'BEGIN { print v } $1==v { 
            Diff=$6-Last
            if (Diff < 0) printf "%s %+7d\n", $3, Lost+=Diff 
            Last=$6
        } END {
            printf "%-19s %7d\n\n", "Total", Lost
        }' data.csv
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() { n=0
    for vid in $(cut -c1-44 data.csv | sort -u); do	let n++
        awk -vv=$vid '$1==v {print $2" "$4" "$5" "$6}' data.csv > plot.csv
        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title noenhanced '$vid'
        set xdata time
        set timefmt '%s'
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Something Rotten in Georgia

Updated – 2021/01/06, 06:50 PM EST

The results of the Georgia runoff election do not make any logical sense to me. In the last 2 months I have probably seen political ads over 1000 times! No exaggeration. All for the 2 Senate seats. There was not a single ad for Public Service Commission District 4, and yet the Republican running for this seat got more votes than the Republicans running for either Senate seat:

You can also grab NY Times’ data using this Linux one-liner (assuming you have curl and jq installed):

$ curl -s https://static01.nyt.com/elections-assets/2020/data/api/2021-01-05/state-page/georgia.json | jq -Mc '.data.races[].candidates[]|[.votes,.last_name,.party_id]' | tr -d '["]'

The result matches the official GA site:

2272277,Warnock,democrat
2189111,Loeffler,republican
2253203,Ossoff,democrat
2208129,Perdue,republican
2227870,McDonald,republican
2185670,Blackman,democrat

McDonald got ~20K more votes than Perdue, and ~39K more than Loeffler.
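
You can compute those gaps directly from the one-liner’s output instead of eyeballing them. A minimal sketch, assuming you saved that output to a file named runoff.csv (my name, not part of the original command):

$ awk -F, '{ v[$2]=$1 } END {
    print "McDonald - Perdue:  ", v["McDonald"]-v["Perdue"]
    print "McDonald - Loeffler:", v["McDonald"]-v["Loeffler"]
}' runoff.csv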

So the question is: How did this happen? How did Republicans manage to vote more for a less important race?

Do you really believe Republicans would vote more for Public Service Commission District 4 than for two Senate seats???

No way!

It sure smells like fraud. As if ballots were thrown out … or switched to Democrats.

Also, the Democrat for Commission District 4 got fewer votes than the other Democrats. As if many fake ballots were produced rapidly just for the Senate seats, and the perpetrators didn’t have time to fill in this seat.

How about the November 2020 Senate election data? I combine both Senate races:

$ curl -s 'https://static01.nyt.com/elections-assets/2020/data/api/2020-11-03/state-page/georgia.json' | jq -Mc '.data.races[1,2].candidates[]|[.party_id,.last_name,.votes]' | tr -d '["]' | tee november.csv

republican,Perdue,2462617
democrat,Ossoff,2374519
libertarian,Hazel,115039
write-ins,Write-ins,265
democrat,Warnock,1617035
republican,Loeffler,1273214
republican,Collins,980454
democrat,Jackson,324118
democrat,Lieberman,136021
democrat,Johnson-Shealey,106767
democrat,James,94406
republican,Grayson,51592
democrat,Slade,44945
republican,Jackson,44335
republican,Taylor,40349
republican,Johnson,36176
libertarian,Slowinski,35431
democrat,Winfield,28687
democrat,Tarver,26333
independent,Buckley,17954
green,Fortuin,15293
independent,Bartell,14640
independent,Stovall,13318
independent,Greene,13293
write-ins,Write-ins,34

I open the results in Excel and combine the data into Left and Right candidates. Republican and Libertarian are obviously Right. Democrat and Green are obviously Left. I give the Left a huge boost by counting Independents with the Left, and despite this …
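
The same grouping can be reproduced without Excel. Here is a minimal sketch over november.csv from the command above; the camp assignments are exactly the ones described in the paragraph, with Independents counted with the Left:

$ awk -F, '
    $1=="republican" || $1=="libertarian"                { right += $3 }
    $1=="democrat" || $1=="green" || $1=="independent"   { left  += $3 }
    END { printf "Right: %d\nLeft:  %d\n", right, left }
' november.csv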

The Right Wins. What’s surprising is that the Right lost both Senate Runoff Elections. Why?

Fraud!

Something is very rotten in the state of Georgia!

-Zoe

If you missed it: https://phzoe.com/2021/01/06/georgia-senate-runoff-timestamp-data/

Georgia Senate Runoff Timestamp Data

I was trying to find timestamp data for the Georgia 2021 Senate runoff election. I couldn’t find it easily via a Google search, but I kept digging and managed to extract it from the NY Times’ live prediction data feed. Here it is …

Senate Race 1
Senate Race 2

Code … georgia.sh:

# Zoe Phin, 2021/01/06

require() { sudo apt-get -y install curl jq gnuplot; }

download() { 
    curl -o ga1.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-G-S-2021-01-05.json
    curl -o ga2.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-S-S-2021-01-05.json
}

timeseries() {
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga1.json | tr -d '["]' > ga1.csv
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga2.json | tr -d '["]' > ga2.csv
}

format() {
    for i in 1 2; do
        (echo "Timestamp            Votes  Rep %   Dem %   Rep     Dem"
         awk -F, '{ "TZ=US/Eastern date +%x,%R:%S -d "$1 | getline t; printf "%s %7d %6.4f %6.4f %7d %7d\n", t, $2, $3, $4, $2*$3, $2*$4 }' ga$i.csv
        ) > ga$i.txt
    done
}

plot() {
    awk -F, '{ "TZ=US/Eastern date +%d%H%M%S -d "$1 | getline t; printf "%s %7d %7d\n", t, $2*$3, $2*$4 }' ga$1.csv > ga$1.dat
    (echo 'set term png size 640,480
    set key top left
    set grid xtics ytics
    set ylabel "Million Votes"
    set timefmt "%d%H%M%S"
    set xdata time
    set xtics format "01/%d\n%H:%M"
    set ytics format "%.1f"
    set mytics 5'
    echo "plot 'ga${1}.dat' u 1:(\$2/1e6) t '$2' w lines lc rgb 'red','' u 1:(\$3/1e6) t '$3' w lines lc rgb 'blue'"
    ) | gnuplot > ga$1.png
}

Run it:

$ source georgia.sh; require; download; timeseries; format

format generates timestamp data into two files: ga1.txt and ga2.txt. The results are archived here and here, respectively.

Race 1 is Perdue vs. Ossoff, and Race 2 is Loeffler vs. Warnock.

To plot the data:

$ plot 1 Perdue Ossoff
$ plot 2 Loeffler Warnock

This generates ga1.png and ga2.png, which I present above.

I left my opinion out of this post. Curious Windows coders should follow instructions here.

Enjoy the data 🙂 -Zoe

Heat flux in the Sun

The Sun is known to emit ~63 megawatts per square meter (MW/m²) from its photosphere. But what is the heat flux inside this emissive photosphere?
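
For reference, the ~63 MW/m² figure is just the Stefan-Boltzmann law applied to the Sun’s effective temperature of about 5772 K. A quick check, as a sketch (the 5772 K value is the standard effective temperature, not something from this post):

$ awk 'BEGIN { s=5.670367E-8; T=5772; printf "%.1f MW/m^2\n", s*T^4/1e6 }'
62.9 MW/m^2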

Source

Heat flux formula: q = k·ΔT/L

q = k × (6600 K − 4400 K) / 500,000 m

What is the thermal conductivity (k) value of hydrogen at these temperatures? [1]

This is actually very difficult to find, but I managed to find something:

Thermal Conductivity of Hydrogen, Source, Figure 5

The y-axis needs to be divided by 10 to get units of W/(m·K).

The range of pressure in the photosphere is 0.0009 atm to 0.123 atm. I think it’s safe to say that the thermal conductivity of hydrogen is definitely no more than 2.5 W/(m·K) in our desired range. That will be our upper limit. Thus,

q = 2.5 * 2200 / 500000 = 0.011 W/m² [2]
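
The same arithmetic as a one-liner, using the upper-limit k assumed above:

$ awk 'BEGIN { k=2.5; dT=6600-4400; L=5e5; printf "q = %.3f W/m^2\n", k*dT/L }'
q = 0.011 W/m^2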

As you can see, there is no problem with 0.011 W/m² “supporting” a 63 MW/m² output.

My critics will be quick to point out that I can’t use the conduction formula because the Sun only has radiative transfers in the photosphere. But that’s just their excuse for NEVER figuring out what the internal heat flow is. Any of their attempts at doing so will be embarrassing for them, and so they will avoid it at all costs. Surely there is a heat flux, and surely it can be figured out.

My critics believe in conservation of heat flow, that is: internal heat flux => emergent radiation. There must be equality of these two things, otherwise there will be rapid cooling. Well, the Sun has had 4.5 billion years to reach their “steady-state” equilibrium nonsense and it’s nowhere close. Maybe, despite all their chest-thumping, they have no idea what they’re talking about?

What goes for the sun here goes for my geothermal theory as well.

Just as a <0.011 W/m² internal heat flux can “support” a 63 MW/m² emission from the Sun, so too can a ~0.9 W/m² geothermal heat flux “support” a ~334 W/m² emission from Earth’s surface.

And why is that? See here and here.

Think about it!

Enjoy 🙂 -Zoe

Note:

[1] I left out helium. I don’t care to include it, as it makes little difference. I only care about being right within an order of magnitude.

[2] I don’t include the surface area of emission, because the difference in solar radius between the top and bottom of the photosphere is too small.