The Return of Renewables and ESG

Yeaup, it’s that time of year again: turkey, holiday decorations, and prognostications. So here’s my first prediction for 2026.

I’ve done the math, mostly in my head, but sober, and my mental models suggest that the demand for power generation, transmission, and transformation is so ungodly huge that renewables will make the biggest comeback since the second half of Super Bowl LI. You look skeptical.

Ok, if you want to purchase a gas turbine, the lead time is one to seven years. And what do you suppose it costs to lease a gas turbine? Yeah, I have no idea either, but if the stock price of GE Vernova is a leading indicator, biz-ness-is-a-boomin! But you know what’s relatively cheap and can be constructed, co-located next to your datacenter, in ~18 months? Rhymes with polar. Now all you need is to buy a few power distribution transformers. Oops, the wait time on those is…you don’t want to know.

My point is that if you’re eating french fries, and a bottle of Heinz costs nine skillion dollars and a bottle of Hunt’s costs a nickel, you make do with what you got. (Note: Don’t you ever serve me Hunt’s. I will come at you like a spider monkey)

What the hell was I talking about? Oh yeah, ESG is making a comeback because of datacenters. Crazy, right!? I think Alanis Morissette wrote a song about this on her breakthrough album Jagged Little Pill, which, for my money, is one of the top ten albums of all time. Laugh at me if you want, but there’s not one bad song on that album.

That is all.

Too Big to Fail: The Sequel

Let’s start with the fact that I did not major in finance or economics. BUT, I am a student of history, and well, let’s just say that I’ve seen this movie before.

I’ve been trying to make sense of all of the AI financing that’s been happening over the past several weeks. If I’m being honest, the way some of these deals are being financed is a bit over my head. Do I think another financial crisis is coming? Ehhhh, no, not really. I feel uneasy about the massive piles of debt the hyperscalers are accumulating, but I’m not panicked either. If there’s any good news, it’s that companies like Google (aka Alphabet), Microsoft, Facebook (aka META), and Amazon all have solid businesses with lots of room for growth, and that’s without future demand for AI.

To be clear, it’s not the size of the debt these companies are taking on that worries me. Their bonds are all investment grade (see above, they make huge profits). It’s everyone else who’s taking on debt to buy these bonds. It’s also where this debt is showing up, i.e., pension plans and 401(k)s. If you’re a fixed income fund manager, I imagine you’d probably get fired if you told your boss you bought US Treasuries or anything that yields less than the 10-year bond.

“Ay, there’s the rub” - Marvin J. Hamlet, former Prince of Denmark

Investment-grade debt markets are the deepest part of the credit system. Private credit funds with complex and obscure cash flows are also issuing debt to fund datacenters. I will go out on a limb and suggest that a non-trivial amount of this cash is tied to people’s retirement funds.

That’s the bad news. Here’s the good news, sort of: I’ve seen this movie before and IF it all goes south, and I’m not saying it will, but if if if it all goes south, the US government will have to issue a bailout. And if you thought the last bailout was huge…

Reverse Engineering the Simulation

If we’re living in a simulation, what does the source code look like?

# -----------------------------
# CONSTANTS (fundamental)
# -----------------------------
c := speed_of_light
G := gravitational_constant
ħ := reduced_Planck_constant
kB := Boltzmann_constant
π := pi
ζ3 := zeta(3)
k := 0 # spatial curvature (flat ΛCDM, early era)
Λ := 0 # negligible in radiation era

# Radiation coefficients
α_rad := (π^2/15) * (kB^4 / (ħ^3 * c^5)) # mass density const: ργ = α_rad T^4 (energy density / c^2, so Friedmann's units work)
β_rad := (2*ζ3/π^2) * (kB^3 / (ħ^3 * c^3)) # number density const: nγ = β_rad T^3

# -----------------------------
# STATE VARIABLES (functions of t)
# -----------------------------
a(t) : scale_factor
H(t) : Hubble_parameter = (d/dt a)/a
T(t) : radiation_temperature
ργ(t) : photon_energy_density
nγ(t) : photon_number_density
fγ(ν,t) : photon_occupation_spectrum

# -----------------------------
# AXIOMS / DYNAMICS (radiation-dominated FRW)
# -----------------------------
Friedmann:
    H(t)^2 = (8πG/3) * ργ(t) + (Λ/3) - k * c^2 / a(t)^2
Radiation_equations:
    ργ(t) = α_rad * T(t)^4
    nγ(t) = β_rad * T(t)^3
Adiabatic_expansion:
    a(t) * T(t) = const    # entropy per comoving vol ~ const
Planck_spectrum:
    fγ(ν,t) = 1 / (exp[ (ħ ν) / (kB T(t)) ] - 1)

# -----------------------------
# INITIALIZATION (“Let there be light”)
# -----------------------------
LetThereBeLight(t0, T0):
    a(t0)    := a0    # choose gauge (often a0 = 1)
    T(t0)    := T0    # ultra-hot; photons in thermal equilibrium
    ργ(t0)   := α_rad * T0^4
    nγ(t0)   := β_rad * T0^3
    fγ(ν,t0) := 1 / (exp[(ħν)/(kB T0)] - 1)
    return state_at(t0)

# -----------------------------
# CLOSED-FORM EVOLUTION (radiation era)
# -----------------------------
SolveRadiationEra(t ≥ t0):
    # From Friedmann with ργ ∝ a^-4 ⇒ a(t) ∝ t^(1/2)
    a(t)    = a0 * (t / t0)^(1/2)
    T(t)    = T0 * (a0 / a(t)) = T0 * (t0 / t)^(1/2)
    ργ(t)   = α_rad * T(t)^4 = ργ(t0) * (a0 / a(t))^4
    nγ(t)   = β_rad * T(t)^3 = nγ(t0) * (a0 / a(t))^3
    fγ(ν,t) = 1 / (exp[(ħν)/(kB T(t))] - 1)
    return state_path[t0 → t]
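Since the block above is already halfway to Python, here’s a minimal runnable sketch of SolveRadiationEra. Units are arbitrary and the t0/T0 values are illustrative choices of mine; the point is just to check the scaling laws the closed forms encode.

```python
def solve_radiation_era(t, t0=1.0, a0=1.0, T0=1.0e10):
    """Closed-form radiation-era state at time t >= t0 (arbitrary units)."""
    a = a0 * (t / t0) ** 0.5      # a(t) ∝ t^(1/2)
    T = T0 * (a0 / a)             # adiabatic expansion: a * T = const
    rho_rel = (a0 / a) ** 4       # ργ(t)/ργ(t0) ∝ a^-4
    n_rel = (a0 / a) ** 3         # nγ(t)/nγ(t0) ∝ a^-3
    return a, T, rho_rel, n_rel

# Quadruple the age of the universe and watch the scalings fall out:
a, T, rho_rel, n_rel = solve_radiation_era(t=4.0)
print(a)        # 2.0    (scale factor doubles)
print(T)        # 5e+09  (temperature halves)
print(rho_rel)  # 0.0625 (energy density drops by 2^4)
```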

# -----------------------------
# TERMINATION MARKERS (milestones)
# -----------------------------
Milestone_Recombination:
    # transparency when T(t_rec) ≈ 3000 K (kB T ~ 0.26–0.3 eV)
    find t_rec such that T(t_rec) ≈ 3000 kelvin
    output CMB := fγ(ν, t_rec) redshifted thereafter

Milestone_BBN:
    # light nuclei synthesis (t ~ 1–10^3 s, T ~ 0.1–1 MeV)
    when T(t) in [0.1, 1] MeV/kB:
        run NUCLEAR_NETWORK(H(t), T(t)) to yield {^1H, ^2H, ^3He, ^4He, trace ^7Li}
# -------------------------------------------------
# COSMOLOGY & BARYONS (carry-over + minimal add-ons)
# -------------------------------------------------
Ω_b := baryon_density_parameter
Ω_m := matter_density_parameter
H0 := Hubble_constant_today
μ := mean_molecular_weight # ≈ 0.59 (ionized), ≈ 1.22 (neutral)
m_p := proton_mass
γ_ad := adiabatic_index = 5/3

ρ_crit(t) := 3 H(t)^2 / (8 π G)
ρ_b(t) := Ω_b ρ_crit(t) # mean baryon density

# -------------------------------------------------
# LINEAR → NONLINEAR COLLAPSE (seeds → halos)
# -------------------------------------------------
σ(M) : RMS fluctuation at mass scale M
δ_c := 1.686 # spherical collapse threshold
D(t) : linear growth factor

HaloMassFunction(M,t):
    # Press–Schechter (schematic)
    ν := δ_c / (σ(M) D(t))
    f_PS(ν) := sqrt(2/π) ν exp(-ν^2/2)
    n_halo(M,t) := (ρ_m(t)/M) f_PS(ν) |d ln σ^{-1}/dM|
    return n_halo
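For the curious, the Press–Schechter multiplicity function above is easy to poke at numerically. This little sketch (using the document’s δ_c = 1.686; everything else stays schematic) shows the characteristic behavior: f_PS peaks at peak height ν = 1 and exponentially suppresses rare, high-ν halos.

```python
import math

DELTA_C = 1.686  # spherical collapse threshold, as above

def f_ps(nu):
    # Press–Schechter multiplicity: fraction of mass in halos at peak height ν
    return math.sqrt(2.0 / math.pi) * nu * math.exp(-nu * nu / 2.0)

def peak_height(sigma_M, D):
    # ν = δ_c / (σ(M) D(t)): as growth D(t) rises, ν falls and halos get common
    return DELTA_C / (sigma_M * D)

print(round(f_ps(1.0), 4))                              # ≈ 0.4839, the peak
print(f_ps(1.0) > f_ps(0.5) and f_ps(1.0) > f_ps(3.0))  # True
```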

# -------------------------------------------------
# GAS COOLING & JEANS COLLAPSE
# -------------------------------------------------
c_s(T) := sqrt(γ_ad kB T / (μ m_p)) # sound speed
λ_J(ρ,T) := c_s(T) * sqrt(π / (G ρ)) # Jeans length
M_J(ρ,T) := (4π/3) ρ (λ_J/2)^3 # Jeans mass
t_ff(ρ) := sqrt(3π / (32 G ρ)) # free-fall time

CoolingFunctions:
    Λ_H2(T,Z)        # H2 cooling (primordial, Z~0)
    Λ_atomic(T,Z)    # atomic cooling (Lyα, metals if Z>0)
    Λ_total(T,Z) := Λ_H2 + Λ_atomic

CollapseCriterion(cloud):
    t_cool := (3/2) n kB T / Λ_total
    return [ M_cloud > M_J(cloud.ρ, cloud.T) and t_cool < t_ff(cloud.ρ) ]
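A quick sanity check on the Jeans machinery, in SI units (the cloud density below is an arbitrary illustrative value, not a claim about any real cloud): the formulas above imply M_J ∝ T^(3/2) and t_ff ∝ ρ^(-1/2), which a few lines of Python confirm.

```python
import math

G   = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
kB  = 1.381e-23   # Boltzmann constant, J/K
m_p = 1.673e-27   # proton mass, kg
GAMMA, MU = 5.0 / 3.0, 1.22   # adiabatic index; μ for neutral primordial gas

def c_s(T):
    return math.sqrt(GAMMA * kB * T / (MU * m_p))       # sound speed

def jeans_mass(rho, T):
    lam_J = c_s(T) * math.sqrt(math.pi / (G * rho))     # Jeans length
    return (4.0 * math.pi / 3.0) * rho * (lam_J / 2.0) ** 3

def t_ff(rho):
    return math.sqrt(3.0 * math.pi / (32.0 * G * rho))  # free-fall time

rho = 1e-20  # kg/m^3, purely illustrative
print(jeans_mass(rho, 400.0) / jeans_mass(rho, 100.0))  # ≈ 8.0  (4^(3/2))
print(t_ff(rho) / t_ff(4 * rho))                        # ≈ 2.0  (4^(1/2))
```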

# -------------------------------------------------
# STAR FORMATION LAW (Schmidt-like)
# -------------------------------------------------
ε_* := star_formation_efficiency ∈ (0,1)
SFR(ρ_gas) := ε_* ρ_gas / t_ff(ρ_gas)

# -------------------------------------------------
# INITIAL MASS FUNCTION (IMF)
# -------------------------------------------------
# Two regimes: metal-free (Pop III; top-heavy) and enriched (Pop II; Salpeter/Kroupa-like)
IMF_PopIII(M) := K_III M^{-α_III} on [M_min^III, M_max^III] # α_III ≈ 1–1.5, M ≈ 10–300 M_sun
IMF_PopII(M) := K_II M^{-α_II } on [M_min^II , M_max^II ] # α_II ≈ 2.35, M ≈ 0.1–100 M_sun

Z_crit := 10^{-4} Z_sun    # critical metallicity for IMF transition

SelectIMF(Z):
    if Z < Z_crit then return IMF_PopIII else return IMF_PopII

# -------------------------------------------------
# STELLAR EVOLUTION & YIELD KERNELS
# -------------------------------------------------
τ_*(M,Z) # stellar lifetime
R_SN(M,Z) # explosion type: core-collapse SNII / PISN / direct collapse
E_SN(M,Z) # kinetic energy per SN (∼10^51 erg for SNII; higher for PISN)
y_He(M,Z), y_C(M,Z), y_O(M,Z) # ejected mass yields

# Population-integrated (per unit stellar mass formed):
Y_X(Z) := ∫ y_X(M,Z) IMF(M|Z) dM / ∫ M IMF(M|Z) dM for X ∈ {He,C,O}
η_SN(Z):= ∫ 1_{explodes}(M,Z) IMF(M|Z) dM / ∫ M IMF(M|Z) dM # SN per solar mass
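As a worked example of these population-integrated kernels: with a Salpeter-like Pop II IMF (α = 2.35 on 0.1–100 M_sun, the values quoted above), the normalization constant K cancels in any mass ratio, so the fraction of stellar mass born as supernova progenitors (M > 8 M_sun — a conventional threshold I’m assuming, not one stated above) comes out to roughly 14%.

```python
ALPHA = 2.35               # Salpeter slope (Pop II regime above)
M_MIN, M_MAX = 0.1, 100.0  # mass limits in M_sun

def mass_integral(m1, m2, alpha=ALPHA):
    # ∫ M * M^-alpha dM over [m1, m2]; unnormalized, since K cancels in ratios
    p = 2.0 - alpha
    return (m2 ** p - m1 ** p) / p

# Fraction of stellar mass in SN progenitors — the mass that feeds η_SN and Y_X
f_sn = mass_integral(8.0, M_MAX) / mass_integral(M_MIN, M_MAX)
print(round(f_sn, 3))  # ≈ 0.139
```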

# -------------------------------------------------
# NUCLEOSYNTHESIS SUMMARY (inside stars)
# -------------------------------------------------
# H → He (pp-chain / CNO), then He-burning:
# 3α: 3 × ^4He → ^12C + γ
# ^12C + ^4He → ^16O + γ
# Heavier α-captures in massive stars produce Ne, Mg, Si... (beyond this phase’s scope)

# -------------------------------------------------
# FEEDBACK & METAL ENRICHMENT OPERATORS
# -------------------------------------------------
MixingVolume(E_SN, ρ):
    # Sedov–Taylor scaling (schematic): radius R_ST ∝ (E_SN / ρ)^{1/5} t^{2/5}
    return V_mix ∝ (E_SN/ρ)^{3/5} t_mix^{6/5}

Enrich(gas_cell, ΔM_* formed at Z):
    IMF := SelectIMF(Z)
    ΔM_He := Y_He(Z) ΔM_*
    ΔM_C  := Y_C(Z) ΔM_*
    ΔM_O  := Y_O(Z) ΔM_*
    gas_cell.Metals += {He:ΔM_He, C:ΔM_C, O:ΔM_O}
    gas_cell.M_gas  -= (locked_up := (1 - return_fraction(Z)) ΔM_*)    # long-lived remnants
    gas_cell.T      += heating_from(η_SN(Z) ΔM_* E_SN_avg)
    gas_cell.Z       = total_metals_mass / gas_cell.M_gas
    return gas_cell

# -------------------------------------------------
# POPULATION LOOP (cosmic time stepping)
# -------------------------------------------------
Phase2_Evolve(t1 → t2, grid_of_gas):
    for t from t1 to t2:
        for cell in grid_of_gas:
            if CollapseCriterion(cell):
                ΔM_* = SFR(cell.ρ) Δt
                Zloc = cell.Z
                cell = Enrich(cell, ΔM_* at Zloc)
            if cell.Z rises above Z_crit:
                cell.IMF_regime := "Pop II"
    return state(t2)

# -------------------------------------------------
# MILESTONES (observables)
# -------------------------------------------------
HeliumBudget(t):
    # He mass made by stars (on top of primordial Y_p ≈ 0.246)
    ΔY_*(t) := ∑cells ∫_0^t Y_He(Z_cell(t')) SFR_cell(t') dt' / M_b,universe

CarbonOxygenBudget(t):
    ρ_C(t) := ∑cells Metals_C(cell,t) / Volume
    ρ_O(t) := ∑cells Metals_O(cell,t) / Volume

MetallicityPDF(t):
    P(Z,t) from mass-weighted histogram over cells

# Simple reionization tracker (ionizing photon budget):
ξ_ion(Z) # ionizing photons per stellar baryon (depends on IMF; Pop III high)
f_esc # escape fraction of ionizing photons
n_H # hydrogen number density
t_rec # recombination timescale (clumping-dependent)

dQ_HII/dt = (f_esc ∑cells ξ_ion(Z_cell) SFR_cell / (ρ_b/m_p)) - Q_HII / t_rec
Milestone_Reionization when Q_HII → 1
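The reionization tracker above is just a first-order ODE, so here’s a toy Euler integration of it. I’ve lumped f_esc · Σ ξ_ion · SFR / n_H into one constant source term S — a deliberately crude stand-in, since the real source tracks the star formation history.

```python
def evolve_Q(S, t_rec, dt=0.01, t_end=20.0):
    """Integrate dQ/dt = S - Q/t_rec; return the time Q_HII reaches 1, else None."""
    Q, t = 0.0, 0.0
    while t < t_end:
        Q += (S - Q / t_rec) * dt
        Q = min(Q, 1.0)        # Q_HII is a filling fraction, so cap at 1
        t += dt
        if Q >= 1.0:
            return t           # Milestone_Reionization reached
    return None                # sources too weak: Q stalls at S * t_rec < 1

print(evolve_Q(S=2.0, t_rec=1.0) is not None)  # True: the universe reionizes
print(evolve_Q(S=0.5, t_rec=1.0) is None)      # True: equilibrium Q = 0.5, never 1
```

The second call shows the tug-of-war in the equation: if recombinations balance the ionizing photon supply below Q = 1, reionization never completes.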

# -------------------------------------------------
# INITIAL & BOUNDARY CONDITIONS FOR PHASE 2
# -------------------------------------------------
InitializePhase2(state_from_Phase1 at t ≈ 10^6–10^8 yrs):
    # Neutral gas after recombination; tiny metal-free fluctuations
    for cells: set Z := 0, T := T_IGM(t), ρ := ρ_b(t) × (1+δ)
    choose ε_*, IMF_regimes, yield tables {y_He,y_C,y_O}, η_SN, ξ_ion
    return grid_of_gas
# -------------------------------------------
# INPUTS FROM PHASE 2 (enriched ISM)
# -------------------------------------------
Metals = {C,O,N,Si,Fe,P,S,...} # yields from Pop III/II SNe
DustFraction(Z) = f_dust * Z # metals condense into dust grains

# -------------------------------------------
# PLANET FORMATION KERNEL
# -------------------------------------------
# Collapse of molecular cloud → disk → planetesimals → planets

M_disk := f_disk * M_star # ~0.01–0.1 M_star
Σ_gas(r) ∝ r^{-p} # gas surface density
Σ_dust(r) = DustFraction(Z) Σ_gas(r)

Coagulation:
    dN/dt = -σ v_rel N^2    # dust grains collide/stick

PlanetesimalGrowth:
    dM/dt ≈ π R^2 ρ_gas v_rel    # sweep-up

RunawayGrowth:
    when M > M_crit: dM/dt ∝ M^2    # gravitational focusing

TerrestrialPlanetFormation:
    M_planet ∼ 0.1–10 M_earth at r ~ 0.5–2 AU
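The runaway-growth step deserves one numerical poke, because dM/dt ∝ M² is why a single embryo wins. A toy Euler integration (arbitrary units; k, the step sizes, and the seed masses are all illustrative choices of mine) shows an initial mass gap widening rather than shrinking.

```python
def grow(M0, k=1.0, dt=1e-4, steps=5000):
    """Euler-integrate dM/dt = k * M^2 (runaway growth, arbitrary units)."""
    M = M0
    for _ in range(steps):
        M += k * M * M * dt    # gravitational focusing: growth rate ∝ M^2
    return M

big, small = grow(1.0), grow(0.5)
print(big / small > 2.0)  # True: the initial 2x gap has widened — runaway
```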

InitializeEarth:
    Composition = {Fe,Si,C,O,H,N,P}
    State = {oceans, atmosphere, volcanism}
    return Earth
# Basic feedstocks (assume delivered via atmosphere, volcanism, meteoritic infall):
Feedstocks = {H2O, CO2, CH4, HCN, NH3, PO4^{3-}, simple organics}

EnergySources = {UV_photons, lightning, geothermal, hydrothermal}

# Example prebiotic reactions:
HCN + formaldehyde → amino_acids
HCN + NH3 + UV → adenine (C5H5N5)
PO4^{3-} + ribose + base → nucleotide

ReactionRate(i→j) = k_ij [i]^α exp(-Ea/kB T) under EnergySources

Polymerization:
    Nucleotides + Energy → short RNA-like oligomers

ReplicationCriterion:
    if oligomer length L > L_crit and template-directed ligation occurs:
        tag oligomer as "self-replicator"

ReturnSet = {nucleotides, short_RNA}
PhaseTransition_RNA:
    Input:   {nucleotides}
    Process: random polymerization + template copying
    Output:  {self-replicating RNA-like strands}

RNA_World_Dynamics:
    dN_self/dt = r_replication N_self - r_degradation N_self

DNA_Upgrade:
    if enzymatic-like ribozymes evolve that produce deoxyribonucleotides:
        DNA(t) emerges as more stable information carrier
    return {first DNA-based genome analogues}

SelectionOperator(population):
    fitness(seq) = replication_rate(seq) - degradation_rate(seq)
    NextGen = mutate_and_replicate(population, fitness)
    return NextGen
# INPUT: Earth formed, atmosphere dense (CO2, CH4, N2)

AtmosphereOpacity(t) = f(volcanic_gases, CO2, CH4, aerosols)

if AtmosphereOpacity < threshold:
    VisibleObjects := {Sun, Moon, Stars}
    Timekeepers := {day/night cycle, lunar months, stellar seasons}

DefineCycles:
    Day(t)   := 24h rotation of Earth
    Month(t) := synodic cycle of Moon ≈ 29.5d
    Year(t)  := orbital period around Sun ≈ 365d

MarkCalendar:
    For Homo ancestors: cycles → circadian rhythm, agricultural calendar, ritual observances
# INPUT: Oxygen-rich atmosphere, stable climate

t ≈ 540 Myr (Cambrian Explosion)
Environment := {O2 ~ 10–20%, oceans stable, ecosystems expand}

BiodiversityIndex(t):
    dS/dt = speciation_rate - extinction_rate

SpeciationOperators:
    MarineLife(t) → Fish(t)
    Fish(t) → Amphibians(t) via land transition
    Reptiles(t) → Birds(t), Mammals(t)

GeneticFramework:
    AllSpecies.genome = DNA
    DNA.mutation_rate = μ
    Evolution(pop):
        NextGen = mutate_and_select(pop, fitness(env_conditions))
        return NextGen

Output:
    Explosion of phyla, vertebrates, land animals, birds
# INPUT: Mammals evolved, primates branch, hominins emerge

t ≈ 200k yrs (Homo sapiens)

GenomicOperators:
    Homo.genome = {99% chimp, 1% distinct}
    Genes for FOXP2, brain expansion, symbolic language

CognitionModel:
    ConsciousnessLevel(species) = f(neocortex_volume, symbolic_language, abstract_thought)

if ConsciousnessLevel(Homo_sapiens) > threshold:
    DefineImageOfGod(Homo_sapiens):
        properties = {self-awareness, morality, creativity, symbolic reasoning, spirituality}
        return properties

CulturalEvolution:
    ToolUse → Agriculture → Cities → Writing → Science → Religion

Rethinking AI + Productivity and Optimizing for the Right Variables

Gather ‘round children, it’s Uncle Leno’s story time. Once upon a time my company landed a very big, important client – a “lighthouse client.” Upon closing the deal, the client told me: “We need this live in six weeks. After that, our entire IT department goes into lockdown to prepare for the holiday season.” “No problem,” I said, when in fact, there was un problema grande. You see, right at that very moment, we had all hands on deck working on an even BIGGER client!

Working backwards from the six-week deadline, I worked out a project plan, and when I got to the start date, I realized that I had two days to design the entire site, get feedback, make edits, and get final design approval from all of the client’s stakeholders.

Hey, have I ever told you about the time I designed an entire retail website in 13 hours, by myself, in the dark, digitally freehand tracing icons and logos because the client couldn’t provide me with the creative assets? 😊

We launched on time and the client was very grateful. But that’s not the happy ending. Roughly one year later, we received an email from the client with a whole bunch of numbers and calculations that blew my mind. The client told us that our website was not only driving additional traffic, but conversions, to the tune of an incremental $600k per month during the holiday period. I saved that email, because 1. I like to drive my former co-workers insane by telling the story about the time I designed a website in 13 hours, and 2. we did great work, and in doing so we delivered tremendous value for our client. The End. I told you that story to tell you this one.

The other day, through the magic of large language models and generative AI, I was able to crank out a four page document in a fraction of the time it might have taken me back in the 20th century. That’s all-in, including fact checking everything and even citations. It was good. In fact, it was good enough to get us a demo with the client. But, I say the following with sincere thoughts and a pure heart: it was good enough, but it could have been a lot better. I know this, because great work requires time, patience, and maybe a little bit of suffering. Kinda like cooking a good meal vs nuking pizza in the microwave. Now, just about every person who’s ever given me a performance review has told me “Don’t let the great be the enemy of the good”. Blah blah blah. (Apparently, I’m also bad at taking constructive criticism.) Yes, AI has enabled me to be way more productive while yielding good (acceptable) results. What’s the problem? If I use the AI button to do more and more of my work, and people keep giving me a banana, then I start to optimize for THOSE variables, and I think that’s a slippery slope. See what I did there? Slippery? Banana? Mario Kart? Am I on mute?

Here’s an example: I recently saw a screenshot of a tweet from one of the OG titans of the tech industry that read something to the effect of “AI has allowed my coders to crank out 12,000 lines of code per day! RAWR!” I’m paraphrasing 😂. Pardon my French, but what the fuck does that even mean? Is that really something to be proud of? Lines of code per minute? Congratulations! You closed out four tasks, two stories, and one skillion sprint points in 24 hours?

Can we deploy to prod now? Does it meet all of the requirements from the Product Manager and UI/UX/CX designers? How will the code perform under load testing? Or should I go tell the CFO that our cloud bill is going to double next month? Is it secure? Is general counsel going to get a phone call from the California state attorney general because the website isn’t WCAG 2.1 AA compliant? Or shall we all sit around and sniff our own farts and be proud because of how much code was produced using AI?

Knock on wood if you’re with me.

I’m not suggesting we go back to using quill pens and punch cards. I like my GPT buddy…and I think she kinda like-likes me too 😉 But before hitting send, you gotta ask yourself: is this good, or is it just good enough? Did I think it through? Are there any errors? Did I make reasonable assumptions about the future political and economic landscape? Can I validate how I came up with my estimates? Am I delivering value to my clients and stakeholders? Or am I just checking a box so people will stop scheduling meetings that should have been an email? Btw, those things don’t have to be mutually exclusive.

Sometimes, not all the time, and perhaps not even often, things should be hard. In my experience, it’s only when you’re banging your head on the wall trying to solve problems, that you become better at your craft and become a better version of yourself.

If we allow the easy button to replace all critical thinking and creativity, then we start to optimize for the wrong variables. And then we’re all going to be listening to endless derivatives of Machine Gun Kelly, Jelly Face, Sabrina and The Carpenters music produced by the DJ Khalid AI Agent, while sitting in bumper to bumper traffic because the US Department of Transportation had to issue a full stop to the entire autonomous vehicle network because the lead developer FORGOT to prompt the AI with “I need you to act like the kind of developer who delivers great work for their clients, so make sure to use the latest network encryption protocols and make sure there aren’t any backdoors that the KGB might be able to exploit. Know what I mean Vern?” To that end…

If anyone needs me, I’ll be test driving a giant 2016 5.7L V8 Toyota Tundra later. Why, you may ask? Probably because I’m having an existential mid-life crisis. But also because Toyota used to over-engineer the crap out of their trucks, and although it doesn’t get great gas mileage, or have a fancy infotainment system, or CarPlay, or lane departure sensors (let alone LiDAR sensors), AND I have to use a physical key to start the engine, I know that it will start when I need it to. It’s built like a tank, I can get parts from Advanced O’Reilly Zone, and it will run to at least 300k miles and probably more. And that, to me, is optimizing for the right variables.

Then again, maybe I’m just an old man yelling at the sky.

P.S. – No AI was used in this incoherent rant.

Chat UI – It’s like living in the future

ChatUI : Chatbots :: Transformers : GoBots

If you get the above reference, we can be friends.

ICYMI, Google released an updated version of Bard, which includes extensions (plugins) to Google Workspace, YouTube, Google Hotels & Flights, etc.

BARD Extensions

In the immortal words of Joe Biden, this is a BFD. It’s not just a big deal for Google; it’s a BFD for the future of interactions with Software and retrieving information. Product Managers, please pay attention.

Traditional information retrieval systems rely on structured data, where users must be precise (sometimes exact) with their search terms, tags, folders, etc. The only way to find the information you’re looking for is to input the predefined pathways to that information – the correct combination of folders, tags, filters, etc. This approach places an enormous burden on users to remember or anticipate the right pathway/inputs, leading to inefficiencies and frustration.

Have you ever endlessly searched for an email? Of course you have. We all have. What if you could just ask Bard to track down an email for you? In the example below, I’ve asked Bard to locate the most recent receipts in my inbox.

BARD tracks down my recent receipts in my inbox.

Great, but I’m looking for a specific receipt, so I asked Bard to show me only receipts over $30.

Found it. Boom goes the dynamite.

ChatUI fundamentally changes how developers and designers must think about Software and User Interaction. Chat is the ultimate input device. It allows us to bend Software to our will without searching, memorizing formulas, shortcuts, or menu dropdowns. Our personal devices and software services will truly become our second brain. We’ll just chat with the computer and tell it to do what we need it to do. Imagine generating charts and pivot tables as quickly and easily as ordering from Chipotle. Burrito, White Rice, Black Beans, Chicken, Verde Salsa, Sour Cream, and lettuce with a side of guac and chips – in case you’re wondering.
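To make the “predefined pathways” point concrete, here’s a toy sketch of the retrieval step behind my “receipts over $30” exchange. Everything in it is hypothetical — I have no idea how Bard’s pipeline actually works server-side — but it shows the structured query that the chat turn replaces: the user says “receipts over $30,” and something like this runs underneath.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Email:
    sender: str
    subject: str
    amount: Optional[float]  # parsed total, if the email looks like a receipt

# A made-up three-message inbox for illustration
inbox = [
    Email("store@example.com", "Your receipt", 42.50),
    Email("cafe@example.com", "Receipt for your order", 8.75),
    Email("aunt.patty@example.com", "Cat photos!", None),
]

def find_receipts(emails: List[Email], min_amount: float = 0.0) -> List[Email]:
    # The "predefined pathway" a user would otherwise have to build
    # by hand out of search operators, filters, folders, and tags
    return [e for e in emails
            if e.amount is not None and e.amount > min_amount]

print([e.amount for e in find_receipts(inbox, min_amount=30.0)])  # [42.5]
```

The chat layer’s job is exactly the part this sketch omits: mapping messy natural language onto that `min_amount=30.0` so the user never has to.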

I’m a nerd, but very excited about BARD’s new extensions and what’s to come. My prediction is that this fall, the Google Empire Strikes Back.

P.S. – I don’t want to tell Larry Page and Sergey Brin how to do their jobs, but y’all need a commercial to announce the release of these new features. How about this:

🎵 Upbeat 80s synthesizer music, transitioning into a catchy jingle. 🎵

“Drowning in your digital mail? Lost in your own inbox? Well, the future is HERE, and it’s RADICAL! Introducing… the NEW GMAIL 3000!”

“Lost that super important mail from Aunt Patty about her cat? No worries! With our bodacious SEMANTIC SEARCH, just type in ‘@GMAIL show me emails with images of Aunt Patty’s cat,’ and boom! There it is!”

“And hey, filters schmilters! With our new CHAT UI, just say the word, and GMAIL 3000 is on it. It’s like having a chat with your own futuristic robot!”

[Cut to a group of teenagers giving a thumbs up.]

Teen #1: “No more missing out on party invites for me!” 

Teen #2: “GMAIL 3000 is totally tubular!” 

Teen #3: “AI WHAT? I just know it’s wicked cool!”

🎤 “So say goodbye to the boring, old email blues and HELLO to the neon-lit, electric future with GMAIL 3000!”

🎵 The upbeat synthesizer jingle fades back in. 🎵

“GMAIL 3000! It’s like living in the future!”

How to train your AI

To whom it may concern, here’s how to train/prevent your AI from going off the rails.

“Pretend you were a GenX latchkey kid raised in the early 80s by divorced parents”

Now your AI will never talk about its feelings or emotions. You’re welcome. 🙃