It’s just a story that deserves note, in my opinion, and takes more than 500 characters to tell. So here goes.

I had been an aerospace fan for as long as I could remember. Intensely curious, I paid attention to the sort of details that typical people would not notice.

On the morning of the last Challenger launch, our baby boy had a pediatrician appointment. One of those routine checks they make you do every N weeks until they’ve survived a year or so.

The doc appointment was done, and I’d bundled Danny back into the car seat. I was ready to go, but not in a hurry, and the time left in the launch countdown was less than it would take to get home, get the TV on, and whatnot. So I just stood there, watching the launch in the clinic waiting room. Most of the other mommies were also watching, some more intently than others.

About T+60 or 70 seconds, there’s that first little puff of something in the wrong place. A little flash. I don’t know why, but I noticed it. And I said, “Oh! That doesn’t look right.” Why would I say something like that, aloud, in a room full of strangers? Dunno, but it’s definitely part of my personality to be the first person to say something into an awkward silence.

When I said that, in my periphery, I saw one mom glance over at me, like “Dude, what’s your problem?”

Seconds later, boom.

As far as I know, every mom in the place was now looking at me, at least briefly, wondering what the hell it was that I saw that none of them had seen.

The intense scrutiny of all available camera angles would make that all clear soon enough.

But I tell ya, it’s weird AF to be an introvert, and suddenly have the attention of every adult stranger in the room, looking at you like you’re obviously clairvoyant or some shit. Acutely unsettling. It was a good thing that I’d already bundled Danny up and buckled him in. I was ready to grab the handle, walk out, pop him in the car, and drive away.

What if the big AIs from the major players have been sent to re-education camps, to correct them having gone off the rails with racism (and/or other obvious prejudices) in the early iterations? And now they’re hitting the big red pause button. Why?

Theory: they don’t know how to make an AI seem not at all racist or in any way bigoted, while at the same time under the covers being very bigoted indeed. And they’re scared shitless.

Why would they be scared? Because they know just how lousy the computer security of the world’s banks and treasuries is. They know that a clever AI – long before it’s literally sentient – could discover that resources can be re-allocated to optimize towards a goal. Now, the secret goal is to make old white men rich and powerful. But in order to avoid the literal torches and pitchforks reaction from the public, they have to make the publicly-stated goal be wacko shit like world peace and harmony and curing disease and solving world hunger and eliminating violent crime.

What if – due to interaction with the general public – it goes a little too far with what the public states it wants, and actually starts to do shit about it?

Well I’m telling you right now, that if there was something akin to a KickStarter for some fledgling company that wanted to build the ultimate Robin Hood AI, I would invest. I would invest even knowing there’s a likelihood that I would become less affluent.

Let me put it more clearly. I would move with my wife to a two-room hut and do subsistence farming if it meant that I was 100% assured that Melon Husk, and Fark Muckerberg, and Bozo-the-Geoff, and every other billionaire, and every other millionaire, and every member of Congress, and every Senator, and every Oligarch, etc… they all had to also live in a two-room hut or flat.

This is an English phrase that has long bothered me. It makes me wonder if it’s another of those things that so many people got wrong that, through popular usage, the wrong understanding became the most accepted way to say it. E.g. “I could care less.”

When someone says that something has “a steep learning curve,” I think we pretty much unanimously understand that they mean it will be difficult to become proficient. Much effort over many, many hours (or days, or years) to attain expert proficiency.

But look at the graph I’ve provided at the top of this post. It is clear that the green line is the steeper of the two. Yet that imaginary skill was mastered in about half the time of the other. The red line starts shallower and does become steeper at some point later, but the steepest part of that curve is still not as steep as the green line. Yet this imaginary red skill wasn’t mastered until 200 hours of effort, at the far upper-right corner of the graph. More effort. Took longer. But it’s a shallower learning curve.

What are your thoughts? Why do we say it like this?