As part of the University of North Carolina BIOL222 class, Dr. Catherine Kehl asked her students to “use `matplotlib.pyplot` to make art.” BIOL222 is Introduction to Programming, aimed at students with no programming background. The emphasis is on practical, hands-on active learning.

The students completed the assignment with festive enthusiasm around Halloween. Here are some great examples:

Harris Davis showed an affinity for pumpkins, opting to go 3D!

```
import numpy as np
import matplotlib.pyplot as plt
# get library for 3d plotting
from mpl_toolkits.mplot3d import Axes3D
# make a pumpkin :)
rho = np.linspace(0, 3 * np.pi, 32)
theta, phi = np.meshgrid(rho, rho)
r, R = 0.5, 0.5
X = (R + r * np.cos(phi)) * np.cos(theta)
Y = (R + r * np.cos(phi)) * np.sin(theta)
Z = r * np.sin(phi)
# make the stem
theta1 = np.linspace(0, 2 * np.pi, 90)
r1 = np.linspace(0, 3, 50)
T1, R1 = np.meshgrid(theta1, r1)
X1 = R1 * 0.5 * np.sin(T1)
Y1 = R1 * 0.5 * np.cos(T1)
Z1 = -(np.sqrt(X1**2 + Y1**2) - 0.7)
Z1[Z1 < 0.3] = np.nan
Z1[Z1 > 0.7] = np.nan
# Display the pumpkin & stem
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.set_xlim3d(-1, 1)
ax.set_ylim3d(-1, 1)
ax.set_zlim3d(-1, 1)
ax.plot_surface(X, Y, Z, color="tab:orange", rstride=1, cstride=1)
ax.plot_surface(X1, Y1, Z1, color="tab:green", rstride=1, cstride=1)
plt.show()
```

Bryce Desantis stuck to the biological theme and demonstrated fractal art.

```
import numpy as np
import matplotlib.pyplot as plt
# Barnsley's Fern - Fractal; en.wikipedia.org/wiki/Barnsley_…
# functions for each part of fern:
# stem
def stem(x, y):
    return (0, 0.16 * y)
# smaller leaflets
def smallLeaf(x, y):
    return (0.85 * x + 0.04 * y, -0.04 * x + 0.85 * y + 1.6)
# large left leaflets
def leftLarge(x, y):
    return (0.2 * x - 0.26 * y, 0.23 * x + 0.22 * y + 1.6)
# large right leaflets
def rightLarge(x, y):
    return (-0.15 * x + 0.28 * y, 0.26 * x + 0.24 * y + 0.44)
componentFunctions = [stem, smallLeaf, leftLarge, rightLarge]
# number of data points and frequencies for parts of fern generated:
# lists with all 75000 datapoints
datapoints = 75000
x, y = 0, 0
datapointsX = []
datapointsY = []
# For 75,000 datapoints
for n in range(datapoints):
    FrequencyFunction = np.random.choice(componentFunctions, p=[0.01, 0.85, 0.07, 0.07])
    x, y = FrequencyFunction(x, y)
    datapointsX.append(x)
    datapointsY.append(y)
# Scatter plot & scaled down to 0.1 to show more definition:
plt.scatter(datapointsX, datapointsY, s=0.1, color="g")
# Title of Figure
plt.title("Barnsley's Fern - Assignment 3")
# Changing background color
ax = plt.gca()
ax.set_facecolor("#d8d7bf")
plt.show()
```

Grace Bell got a little trippy with this rotationally symmetric art. It’s pretty cool how she captured mouse events. It reminds us of a flower. What do you see?

```
import matplotlib.pyplot as plt
from matplotlib.tri import Triangulation
from matplotlib.patches import Polygon
import numpy as np
# I found this sample code online and manipulated it to make the art piece!
# was interested in because it combined what we used for functions as well as what we used for plotting with (x,y)
def update_polygon(tri):
    if tri == -1:
        points = [0, 0, 0]
    else:
        points = triang.triangles[tri]
    xs = triang.x[points]
    ys = triang.y[points]
    polygon.set_xy(np.column_stack([xs, ys]))
def on_mouse_move(event):
    if event.inaxes is None:
        tri = -1
    else:
        tri = trifinder(event.xdata, event.ydata)
    update_polygon(tri)
    ax.set_title(f"In triangle {tri}")
    event.canvas.draw()
# this is the info that creates the angles
n_angles = 14
n_radii = 7
min_radius = 0.1 # the radius of the middle circle can move with this variable
radii = np.linspace(min_radius, 0.95, n_radii)
angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
angles = np.repeat(angles[..., np.newaxis], n_radii, axis=1)
angles[:, 1::2] += np.pi / n_angles
x = (radii * np.cos(angles)).flatten()
y = (radii * np.sin(angles)).flatten()
triang = Triangulation(x, y)
triang.set_mask(
    np.hypot(x[triang.triangles].mean(axis=1), y[triang.triangles].mean(axis=1))
    < min_radius
)
trifinder = triang.get_trifinder()
fig, ax = plt.subplots(subplot_kw={"aspect": "equal"})
ax.triplot(triang, "y+-")
# made the color of the plot yellow, with "+" markers at the data points,
# but you can't really see them because of the lines crossing
polygon = Polygon([[0, 0], [0, 0]], facecolor="y")
update_polygon(-1)
ax.add_patch(polygon)
fig.canvas.mpl_connect("motion_notify_event", on_mouse_move)
plt.show()
```

As a bonus, did you like that fox in the banner? That was created (and well documented) by Emily Foster!

```
import numpy as np
import matplotlib.pyplot as plt
plt.axis("off")
# head
xhead = np.arange(-50, 50, 0.1)
yhead = -0.007 * (xhead * xhead) + 100
plt.plot(xhead, yhead, "darkorange")
# outer ears
xearL = np.arange(-45.8, -9, 0.1)
yearL = -0.08 * (xearL * xearL) - 4 * xearL + 70
xearR = np.arange(9, 45.8, 0.1)
yearR = -0.08 * (xearR * xearR) + 4 * xearR + 70
plt.plot(xearL, yearL, "black")
plt.plot(xearR, yearR, "black")
# inner ears
xinL = np.arange(-41.1, -13.7, 0.1)
yinL = -0.08 * (xinL * xinL) - 4 * xinL + 59
xinR = np.arange(13.7, 41.1, 0.1)
yinR = -0.08 * (xinR * xinR) + 4 * xinR + 59
plt.plot(xinL, yinL, "salmon")
plt.plot(xinR, yinR, "salmon")
# bottom of face
xfaceL = np.arange(-49.6, -14, 0.1)
xfaceR = np.arange(14, 49.3, 0.1)
xfaceM = np.arange(-14, 14, 0.1)
plt.plot(xfaceL, abs(xfaceL), "darkorange")
plt.plot(xfaceR, abs(xfaceR), "darkorange")
plt.plot(xfaceM, abs(xfaceM), "black")
# nose
xnose = np.arange(-14, 14, 0.1)
ynose = -0.03 * (xnose * xnose) + 20
plt.plot(xnose, ynose, "black")
# whiskers
xwhiskR = [50, 70, 55, 70, 55, 70, 49.3]
xwhiskL = [-50, -70, -55, -70, -55, -70, -49.3]
ywhisk = [82.6, 85, 70, 65, 60, 45, 49.3]
plt.plot(xwhiskR, ywhisk, "darkorange")
plt.plot(xwhiskL, ywhisk, "darkorange")
# eyes
plt.plot(20, 60, color="black", marker="o", markersize=15)
plt.plot(-20, 60, color="black", marker="o", markersize=15)
plt.plot(22, 62, color="white", marker="o", markersize=6)
plt.plot(-18, 62, color="white", marker="o", markersize=6)
plt.show()
```

We look forward to seeing these students continue in their plotting and scientific adventures!

The IPCC’s *Special Report on Global Warming of 1.5°C* (SR15), published in October 2018,
presented the latest research on anthropogenic climate change.
It was written in response to the 2015 UNFCCC “Paris Agreement” goal of

“holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C […]”

(cf. Article 2.1.a of the Paris Agreement).

As part of the SR15 assessment, an ensemble of quantitative, model-based scenarios was compiled to underpin the scientific analysis. Many of the headline statements widely reported by media are based on this scenario ensemble, including the finding that

global net anthropogenic CO2 emissions decline by ~45% from 2010 levels by 2030 in all pathways limiting global warming to 1.5°C

(cf. statement C.1 in the *Summary for Policymakers*).

When preparing the SR15, the authors wanted to go beyond previous reports not only in the scientific rigor and scope of the analysis, but also to establish new standards in terms of openness, transparency, and reproducibility.

The scenario ensemble was made accessible via an interactive *IAMC 1.5°C Scenario Explorer*
(link) in line with the
FAIR principles for scientific data management and stewardship.
The process for compiling, validating and analyzing the scenario ensemble
was described in an open-access manuscript published in *Nature Climate Change*
(doi: 10.1038/s41558-018-0317-4).

In addition, the Jupyter notebooks generating many of the headline statements, tables and figures (using Matplotlib) were released under an open-source license to facilitate a better understanding of the analysis and enable reuse for subsequent research. The notebooks are available in rendered format and on GitHub.

To facilitate reusability of the scripts and plotting utilities
developed for the SR15 analysis, we started the open-source Python package **pyam**
as a toolbox for working with scenarios from integrated-assessment and energy system models.

The package is a wrapper around pandas and Matplotlib, geared towards several data formats commonly used in energy modelling. Read the docs!
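To give a feel for the kind of data pyam is built around, here is a sketch, using plain pandas, of the wide IAMC table layout: one row per model, scenario, region, variable, and unit combination, and one column per year. The model and scenario names and all numbers below are made up for illustration (the emissions values are simply chosen to mirror the ~45% headline figure):

```
import pandas as pd

# A toy scenario table in the wide IAMC layout (all values made up)
data = pd.DataFrame(
    [
        ["MESSAGE", "1.5C-pathway", "World", "Emissions|CO2", "Mt CO2/yr", 38000, 21000, 5000],
        ["MESSAGE", "1.5C-pathway", "World", "Primary Energy", "EJ/yr", 550, 480, 420],
    ],
    columns=["model", "scenario", "region", "variable", "unit", 2010, 2030, 2050],
)

# Reshape to long format to compute e.g. the 2010 -> 2030 emissions change
long = data.melt(
    id_vars=["model", "scenario", "region", "variable", "unit"],
    var_name="year",
    value_name="value",
)
co2 = long[long["variable"] == "Emissions|CO2"].set_index("year")["value"]
change = (co2[2030] - co2[2010]) / co2[2010]
print(f"CO2 emissions change 2010 -> 2030: {change:+.0%}")
```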

Earth’s temperatures are rising, and nothing shows this in a simpler, more approachable graphic than the “Warming Stripes”. Introduced by Prof. Ed Hawkins, they show the temperatures, either the global average or those of your region, as colored bars from blue to red for the last 170 years, available at #ShowYourStripes.

The stripes have since become the logo of the Scientists for Future. Here is how you can recreate them yourself using Matplotlib.

We are going to use the HadCRUT4 dataset published by the Met Office, which combines sea and land surface temperatures. The warming stripes are based on the annual global average.

First, let’s import everything we are going to use. The plot will consist of a bar for each year, colored using a custom color map.

```
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle
from matplotlib.collections import PatchCollection
from matplotlib.colors import ListedColormap
import pandas as pd
```

Then we define our time limits, our reference period for the neutral color and the range around it for maximum saturation.

```
FIRST = 1850
LAST = 2018 # inclusive
# Reference period for the center of the color scale
FIRST_REFERENCE = 1971
LAST_REFERENCE = 2000
LIM = 0.7 # degrees
```

Here we use pandas to read the fixed-width text file, keeping only the first two columns: the year and the temperature anomaly relative to the 1961–1990 mean.

```
# data from
# https://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.6.0.0.annual_ns_avg.txt
df = pd.read_fwf(
"HadCRUT.4.6.0.0.annual_ns_avg.txt",
index_col=0,
usecols=(0, 1),
names=["year", "anomaly"],
header=None,
)
anomaly = df.loc[FIRST:LAST, "anomaly"].dropna()
reference = anomaly.loc[FIRST_REFERENCE:LAST_REFERENCE].mean()
```

This is our custom colormap. We could also use one of the colormaps that come with `matplotlib`, e.g. `coolwarm` or `RdBu`.
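For instance, swapping in a built-in colormap is a one-liner. Note that it is the reversed variant `RdBu_r` that we would want here, so that low (cold) anomalies map to blue and high (warm) ones to red:

```
import matplotlib as mpl

# Look up a built-in diverging colormap from the registry; the "_r" suffix
# gives the reversed variant: cold anomalies blue, warm anomalies red.
cmap = mpl.colormaps["RdBu_r"]
```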

```
# the colors in this colormap come from http://colorbrewer2.org
# the 8 more saturated colors from the 9 blues / 9 reds
cmap = ListedColormap(
[
"#08306b",
"#08519c",
"#2171b5",
"#4292c6",
"#6baed6",
"#9ecae1",
"#c6dbef",
"#deebf7",
"#fee0d2",
"#fcbba1",
"#fc9272",
"#fb6a4a",
"#ef3b2c",
"#cb181d",
"#a50f15",
"#67000d",
]
)
```

We create a figure with a single axes object that fills the full area of the figure and does not have any axis ticks or labels.

```
fig = plt.figure(figsize=(10, 1))
ax = fig.add_axes([0, 0, 1, 1])
ax.set_axis_off()
```

Finally, we create bars for each year, assign the data, colormap and color limits and add it to the axes.

```
# create a collection with a rectangle for each year
col = PatchCollection([Rectangle((y, 0), 1, 1) for y in range(FIRST, LAST + 1)])
# set data, colormap and color limits
col.set_array(anomaly)
col.set_cmap(cmap)
col.set_clim(reference - LIM, reference + LIM)
ax.add_collection(col)
```

Make sure the axes limits are correct and save the figure.

```
ax.set_ylim(0, 1)
ax.set_xlim(FIRST, LAST + 1)
fig.savefig("warming-stripes.png")
```

Postdocs are the workers of academia. They are the main players behind the majority of scientific papers published in journals and conferences. Yet their effort is often not recognized in terms of salary and benefits.

A few years ago, the NIH established stipend levels for undergraduate, predoctoral, and postdoctoral trainees and fellows, the so-called NIH guidelines. Many universities and research institutes currently adopt these guidelines when deciding how much to pay postdocs.

One of the key problems of the NIH guidelines is that they are set at a national level. This means that a postdoc in Buffalo is paid the same as a postdoc in Boston, even though Buffalo is one of the most affordable cities to live in the USA, while Boston is one of the most expensive. Every year, the NIH releases new guidelines in which the stipends are slightly increased. **Do these adjustments help a postdoc in the Boston area take home a bit more money?**

I have used Matplotlib to plot the NIH stipend levels (y axis) against each year of postdoctoral experience (x axis) for the past four years of NIH guidelines (color). I have also looked at the inflation for the years 2017–2019 and increased the previous year’s salaries by that percentage (dashed lines).
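A minimal sketch of how such a figure could be put together. All stipend and inflation numbers below are placeholders for illustration, not the actual NIH or CPI figures:

```
import matplotlib.pyplot as plt

# Placeholder values only: NOT the actual NIH stipend levels or CPI rates.
stipends = {
    2016: [43000, 45000, 47000, 49000, 51000, 53000, 55000],
    2017: [44000, 46000, 48000, 50000, 52000, 54000, 56000],
    2018: [45000, 47000, 49000, 51000, 53000, 55000, 57000],
    2019: [46000, 48000, 50000, 52000, 54000, 56000, 58000],
}
inflation = {2017: 0.021, 2018: 0.024, 2019: 0.018}
experience = range(7)  # 0 to 6 years of postdoctoral experience

fig, ax = plt.subplots()
for year, levels in stipends.items():
    (line,) = ax.plot(experience, levels, marker="o", label=str(year))
    if year + 1 in inflation:
        # previous year's stipends increased by the next year's inflation rate
        adjusted = [s * (1 + inflation[year + 1]) for s in levels]
        ax.plot(experience, adjusted, "--", color=line.get_color())
ax.set_xlabel("Years of postdoctoral experience")
ax.set_ylabel("Stipend (USD)")
ax.legend(title="Guideline year")
fig.savefig("nih-stipends.png")
```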

The data revealed that in 2017 the salaries of the most senior postdocs were increased only by the inflation rate, while junior postdocs (up to one year of experience) received an increase of more than 2.5 times the inflation rate. In 2018, all salaries were merely adjusted for inflation. In 2019, the increase was slightly higher than inflation. So, overall, every year the NIH makes sure that postdoc salaries are, at least, adjusted for inflation. Great!

As mentioned earlier, some cities in the US are more expensive than others, Boston for example. To partially account for such differences when looking at postdoc salaries, I subtracted from each salary the average rent for a one-bedroom apartment in Boston. Of course, rent also increases every year, but, unfortunately for postdocs, **rent increases far more than inflation**. The results are below.

It turns out that the best year for postdocs with at least one year of experience was actually 2016. In the subsequent years, real estate has eaten larger and larger portions of the postdoc salary, with postdocs paid in 2019 taking home **20% less money** than postdocs with the same experience paid in 2016.

In the end, life is getting financially harder and harder for postdocs in the Boston area. These data should be taken into account by research institutes and universities, which are free to top up postdocs’ salaries to reflect the real cost of living in different cities.

You can download the Jupyter notebook [here](Postdoc salary Analysis.ipynb).
