Processor on my 2017 i7 iMac: i7-7700K
Processor on my 2020 i7 iMac: i7-10700K
Your core argument is clear and grounded, but it leans a bit too confidently in a few places where the reality is more nuanced. Here’s a straightforward take on the strengths and weaknesses:
What works well:
Where it overreaches or could be refined:
Your strongest idea (worth emphasizing more):
How you might sharpen it:
Instead of framing AI as “not that smart,” you could frame it as:
That keeps your skepticism but makes it harder to dismiss.
If you want, I can rewrite your paragraph to sound tighter and more persuasive without losing your viewpoint.
That's essentially an early optimizing compiler, even if it was simple.
Most 8-bit BASICs were interpreted, so compiling to native code would have been a huge speed boost. What you built sits in the same conceptual space as things like:
The fact that someone used it in a commercial product tells you it was genuinely useful—not just a hobby project.
You were aiming for exactly the direction the industry went:
On 8-bit systems, that was hard because:
So most people either:
You were trying to bridge that gap—which is exactly what C did successfully on more capable systems like the Atari ST.
This is the part where your intuition was spot on.
"You could take any programming language and convert it into a series of Forth commands."
That's basically the idea of:
Forth is a stack machine, and many later systems used the same idea:
So what you were noticing is that:
Any high-level language can be compiled into a simpler stack-based instruction set.
That's exactly how modern compilers often work internally.
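A quick way to see this for yourself: CPython itself compiles source to a stack-based bytecode, and the standard-library `dis` module will show you the stack operations. (The expression uses variables rather than constants so the compiler can't fold it away.)

```python
import dis

# CPython compiles this expression to stack-based bytecode:
# push a, push b, push c, multiply top two, add top two.
code = compile("a + b * c", "<expr>", "eval")
dis.dis(code)
```

The exact opcode names vary by Python version (e.g. `BINARY_MULTIPLY` vs. the newer `BINARY_OP`), but the shape is always the same: loads push operands, binary ops pop two and push one. That is Forth's execution model, inside a mainstream language.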
Your interpretation is very plausible.
What you're describing sounds like:
And you're right:
This is very similar to:
Yes—this is literally a core compiler problem.
What you likely implemented (or were exposed to) is one of:
That process:
Infix: 3 + 4 * 5 → postfix (RPN): 3 4 5 * +. Which ties directly back to your Forth insight: RPN and Forth are basically cousins.
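That infix-to-postfix step is classically done with Dijkstra's shunting-yard algorithm. Here's a minimal sketch (the precedence table and token format are simplifying assumptions; it handles only binary operators and parentheses):

```python
# Minimal shunting-yard: infix token list -> RPN token list.
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def to_rpn(tokens):
    out, ops = [], []
    for tok in tokens:
        if tok in PREC:
            # Pop higher/equal-precedence operators before pushing this one.
            while ops and ops[-1] != "(" and PREC[ops[-1]] >= PREC[tok]:
                out.append(ops.pop())
            ops.append(tok)
        elif tok == "(":
            ops.append(tok)
        elif tok == ")":
            while ops[-1] != "(":
                out.append(ops.pop())
            ops.pop()  # discard the "("
        else:
            out.append(tok)  # operand goes straight to output
    while ops:
        out.append(ops.pop())
    return out

print(to_rpn("3 + 4 * 5".split()))  # → ['3', '4', '5', '*', '+']
```

Note how precedence falls out of the algorithm automatically: `*` stays on the operator stack past the `+`, so it lands first in the output.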
What's striking is that you independently touched on:
That's essentially a miniature version of modern compiler architecture:
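To make that concrete: once a front end has emitted RPN, the "back end" is just a stack machine. This toy interpreter (the opcode set is invented for illustration) executes the RPN program from the example above:

```python
# Toy stack machine: executes an RPN program such as [3, 4, 5, "*", "+"].
def run(program):
    stack = []
    for op in program:
        if op == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        else:
            stack.append(op)  # literal operand: push it
    return stack.pop()

print(run([3, 4, 5, "*", "+"]))  # → 23
```

Tokenizer → RPN generator → stack machine is, in miniature, the front-end / IR / back-end split that production compilers still use.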
Once you hit the 68000 + C on the Atari ST, the problem space changed:
So your idea didn't become wrong—it just became redundant.
You were basically reinventing:
…years before that pattern became dominant.
If you'd stayed in that direction, you might have ended up building something very similar to: