When I first started messing around with programming in the 8th grade math lab at Crenshaw Jr. High, the language on hand was Atari BASIC.

10 PRINT "Hello World!"
20 GOTO 10

BASIC was a strictly typed language. To create a string variable, you had to indicate it was a string in its name: string variable names had to end with a $. You also had to allocate memory for your strings up front, which was called dimensioning:

10 DIM MYSTRING$(13)
20 MYSTRING$ = "Hello World!"
30 PRINT MYSTRING$
40 MYSTRING$ = "This is Caspar."
50 PRINT MYSTRING$
60 END

On line 30, you get Hello World! printed to the screen. On line 50 you get This is Caspa, because the string was dimensioned with space for only 13 characters, so the assignment on line 40 silently dropped the rest.

This seemed at the time to be a royal pain in the ass. But it was presented as "just the way computers work", so we accepted it as gospel that programmers needed to make good guesses about memory allocation. When you're working with an 8-bit system and only 48K of RAM, you have to be careful about using memory. That's why the Commodore 64 was such a big deal – another 16K to play with. But you still had to use it wisely. Then the first IBM PCs shipped with 128K. Holy crap!

When the IBM PC came out and my father bought one for his law office, I spent hours downtown in the evenings. By then we had read William Poundstone's The Recursive Universe, which had in the back a couple of printouts of programs for Conway's Game of Life: one in BASIC and the other in assembly. We started with what we knew and typed in the BASIC program. It ran slow as molasses. So we next figured out the assembly version. It had to be assembled into a binary before it could run – this was new to me, since BASIC is interpreted as it runs. But once we got it working, it was (compared to the BASIC version) blazing fast. That's the first time I can remember being addicted to processing speed.

Lesson learned: interpreters are slow, compiled binaries are fast.

In college I programmed drivers for running equipment in the physics lab: stepper motors for turning mirrors to direct laser beams, that kind of thing. We needed immediate response, so I did it in assembly.

Then we were modeling non-linear crystal formations in particle aggregations from solution. Pascal was the language of choice on campus at the time, and one of the math professors wrote a Pascal program to visualize the process. I didn't know anything about Pascal. All I knew was that it ran slow as molasses. This was just another "Game of Life" problem. So I wrote a version in assembly.

When I presented it, the professors thought they were watching a screen recording of the program. "No," I said. "This is running live right now." My chemistry professor couldn't believe it: "This is running live? Holy shit!"

In retrospect, I should have stuck with that. When you have the ability to create something that makes people exclaim, "Holy shit!" (in a good way), you're probably onto something.

But I didn't. I went to seminary, and then on to pastoral ministry. And while I was doing that, the internet happened and I missed it.

Assembly code isn't really much of a language. You get only a few basic operations that let you directly manipulate the bits being pushed in and out of the processor's registers, and control where they get bused out to various locations in memory. So you can't really say it's strongly or weakly typed. It's just bits, and depending on the processor you get to arrange them in bytes (8 bits), words (16 bits), and doublewords (32 bits). (After I left off, they started building processors that could handle quadwords – 64 bits at a time. Again, holy shit!)

By the time I started messing around with computers again it was post-Y2K. By then processors were fast enough that interpreted languages weren't slow as molasses any more. (They're actually still slower than compiled languages, but almost nobody cares.) I wanted to build a website for my church because that had become a thing. PHP, I was told, was "the way" to do it.

That's how I got started with PHP. But things had changed.

Functions had replaced line numbers for program flow. Nobody cared about memory usage anymore. And it seemed magical that you didn't have to care about data types. You could even mix and match them:

function saySomething($something, $times) {
  for ($i = 1; $i <= $times; $i++) {
    // $times is treated as a number here and as a string below.
    echo "For the " . $i . " of " . $times . ": " . $something . "\n";
  }
}

saySomething("Hello World!", 5);

Look at that! $times is both a number and a string! It's like New Shimmer is a floor wax and a dessert topping!
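
A few one-liners show the juggling in action:

var_dump('5' + 5);    // int(10): the string is coerced to a number
var_dump('5' . 5);    // string(2) "55": the number is coerced to a string
var_dump('5' == 5);   // bool(true): loose comparison juggles before comparing
var_dump('5' === 5);  // bool(false): strict comparison checks the type too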

This is all great. Until it isn't. "Type juggling" can only go so far. Sometimes you need one type or another. This led to a lot of manual type checking in PHP code:

function doSomethingWith($aThing) {
  if (is_array($aThing)) {
    // do stuff
  } else {
    // do some kind of fallback stuff
  }
}

Once PHP introduced object data types, you had to be careful again. If you pass a string to a function that needs a specific kind of object, the interpreter has no way to juggle its way out of it. You get a fatal error: Call to a member function on a non-object.
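
Sketching it with a made-up Mailer class (the class and function names here are just for illustration):

class Mailer {
  public function send($to) {
    echo "Sending to " . $to . "\n";
  }
}

function notify($mailer) {
  // Nothing stops a caller from passing in something that isn't a Mailer.
  $mailer->send("somebody@example.com");
}

notify(new Mailer()); // works
notify("oops");       // Fatal error: Call to a member function send() on a non-object
                      // (PHP 7+ phrases it "on string")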

To help with this, you could type-hint arrays and objects, but not much else. And in my experience hardly anyone did. A couple of years ago I had a regular job working on a project where thousands of errors were being generated daily. (I'll let that sink in. Thousands. Daily.) A large part of that was type mismatches. Also, there were no tests.
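
For the record, the old-style hints looked like this, reusing the made-up Mailer from above:

function doSomethingWith(array $things, Mailer $mailer) {
  // PHP checks these hints at call time, so a bad argument
  // fails here instead of somewhere deeper in the function.
}

doSomethingWith(array('a', 'b'), new Mailer()); // fine
doSomethingWith('nope', new Mailer());          // catchable fatal error (a TypeError in PHP 7+)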

There are lots of languages besides PHP that are loosely typed. PHP, though, seems to have a reputation for being sloppy. My best guess is that it's not that PHP (at least modern PHP) is sloppy, but that people who program in PHP tend to be sloppy, and that's what people see the most of.

The most recent versions of PHP (7+) provide for much more robust type hinting. You can even enforce type hints now on a per-file basis. For some this is an unwelcome change. "Who has time for type-hinting?" was the title of a post from some disgruntled person somewhere recently. But I welcome it.
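
Here's a sketch of what saySomething() from earlier might look like with the newer declarations. The declare(strict_types=1) line is the per-file enforcement; the void return type requires PHP 7.1 or later:

declare(strict_types=1); // per-file opt-in: hints are enforced, not juggled

function saySomething(string $something, int $times): void {
  for ($i = 1; $i <= $times; $i++) {
    echo "For the " . $i . " of " . $times . ": " . $something . "\n";
  }
}

saySomething("Hello World!", 5);   // fine
saySomething("Hello World!", "5"); // Uncaught TypeError: $times must be of type int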

The dark side of the magic that lets a variable be both a number and a string is that you have to keep track of what it is, and whether what it is matters. In my older age, remembering things is getting harder and harder. (I suspect it's hard for younger people, too, but they're just not ready to admit it yet.) If PHP is going to give me the luxury of letting the interpreter keep track of that for me, that's a good thing. The interpreter doesn't forget. It throws an error I can fix right away, instead of letting the mistake get to production and then spending hours trying to figure out WTF when it starts throwing errors on other people's websites.

Hours are more precious too, the older I get. "Who has time for type hinting?" Let me put it this way: if you want to hire me to work on your PHP project, but you don't have time for type hinting (and unit tests, but that's another story), I don't really have the time. You may as well have written your project in BASIC.