Computers
Friday, April 24, 2026
Choice of Web Browser
Thursday, April 23, 2026
P(doom) | Real Time with Bill Maher (HBO)
Your core argument is clear and grounded, but it leans a bit too confidently in a few places where the reality is more nuanced. Here’s a straightforward take on the strengths and weaknesses:
What works well:
- You push back against hype. That’s valuable—AI discussions often swing toward extremes.
- The point about scaling via hardware vs. improving algorithms is thoughtful and widely debated in the field.
- You correctly note that current AI systems rely heavily on statistical patterns rather than deep “understanding” in a human sense.
Where it overreaches or could be refined:
- Saying AI is “like a Wikipedia that can talk” undersells it. Modern systems don’t just retrieve facts—they generate novel responses, combine ideas, and can perform nontrivial reasoning in some domains. It’s still probabilistic, but not merely lookup-and-repeat.
- “Beyond probability, AI has little understanding” is philosophically debatable. Even humans rely on pattern recognition more than we like to admit. The real issue is type and reliability of understanding, not its total absence.
- The claim that reasoning is limited to “simple problems” is outdated. AI can now handle fairly complex reasoning tasks, though inconsistently and with failure modes.
- “Obviously AI will have controls” is asserted too casually. In practice, governance, safety, and control mechanisms are active areas of concern—not something universally agreed upon or solved.
Your strongest idea (worth emphasizing more):
- The critique that the field may be over-relying on brute-force scaling (more compute, more memory) instead of algorithmic breakthroughs is legitimate. Many researchers are asking the same question.
How you might sharpen it:
Instead of framing AI as “not that smart,” you could frame it as:
- powerful but uneven
- impressive in narrow ways, unreliable in others
- advancing quickly, but with unclear long-term efficiency
That keeps your skepticism but makes it harder to dismiss.
If you want, I can rewrite your paragraph to sound tighter and more persuasive without losing your viewpoint.
Wednesday, April 22, 2026
Monday, April 20, 2026
The computer I purchased in 2023
eufy L60 Robot Vacuum with Self Empty Station
Wednesday, April 15, 2026
Terminator: The Story of the 6502 Code You See Onscreen
The 6502 was a very simple chip, but it was cheaper and faster than other 8-bit chips.
I loved programming the 65816 on the SNES. I wonder how the world would look today if we had gotten 32- and 64-bit versions of the 6502.
Saturday, April 4, 2026
Tuesday, March 31, 2026
So This is Peak Foldable
When a phone unfolds from 6.6 inches to 8.1 inches, why bother—especially with a $1700 price tag? Normal phones range from 6.1 to 6.9 inches, which is fine for everyday use.
It seems to me that people want a tablet in a phone, but an 8.1-inch display falls short of my 11-inch tablet. Maybe the argument is that you can have one device instead of two, but for this price you could get two devices.
Friday, March 27, 2026
LLVM - Wikipedia
1. Your BASIC-to-machine-code compiler
That's essentially an early native-code compiler, even if it was simple.
Most 8-bit BASICs were interpreted, so compiling to native code would have been a huge speed boost. What you built sits in the same conceptual space as things like:
- tokenizing BASIC (already common)
- then going one step further: emitting machine code instead of interpreting
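The difference between those two steps can be shown with a minimal sketch. This is illustrative only: the function names and the tiny "LET variable = constant" case are invented, but the contrast is real — an interpreter does the work every time a line runs, while a compiler emits 6502-style instructions once.

```python
# Hypothetical sketch: compiling a tokenized BASIC assignment to 6502-style
# assembly instead of interpreting it. Names and the opcode choice are
# illustrative, not from any real BASIC compiler.

def compile_let(var_addr, value):
    """Emit 6502-style code for LET <var> = <constant> (8-bit value)."""
    return [
        f"LDA #${value:02X}",    # load the constant into the accumulator
        f"STA ${var_addr:04X}",  # store it at the variable's address
    ]

def interpret_let(memory, var_addr, value):
    """The interpreter equivalent: do the same work, but at runtime."""
    memory[var_addr] = value
    return memory

print(compile_let(0x0800, 42))  # ['LDA #$2A', 'STA $0800']
```

The compiled version pays the translation cost once; the interpreted version pays a dispatch-and-decode cost on every execution, which is where the big speedup on 8-bit machines came from.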
The fact that someone used it in a commercial product tells you it was genuinely useful—not just a hobby project.
2. Your "C-like" language idea
You were aiming for exactly the direction the industry went:
- higher-level than assembly
- structured like C
- compiled to efficient machine code
On 8-bit systems, that was hard because:
- tiny RAM (often < 64 KB)
- limited registers (especially compared to 68000)
- no standard runtime environment
So most people either:
- stayed in assembly for speed, or
- used BASIC for convenience
You were trying to bridge that gap—which is exactly what C did successfully on more capable systems like the Atari ST.
3. Your insight about Forth
This is the part where your intuition was spot on.
"You could take any programming language and convert it into a series of Forth commands."
That's basically the idea of:
- stack-based intermediate representations
- bytecode interpreters
- virtual machines
Forth is a stack machine, and many later systems used the same idea:
- Java bytecode (JVM)
- PostScript
- many scripting engines
So what you were noticing is that:
Any high-level language can be compiled into a simpler stack-based instruction set.
That's exactly how modern compilers often work internally.
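That lowering idea fits in a few lines. The sketch below is a toy, not any real VM: an expression is flattened into a Forth-like stack instruction list and executed by a minimal stack machine.

```python
# Minimal sketch of the idea: an expression lowered to a Forth-like stack
# instruction set, run on a tiny stack machine. All names are illustrative.

def run(program):
    """Execute a list of stack instructions: ('push', n) or an operator."""
    stack = []
    ops = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
    for instr in program:
        if isinstance(instr, tuple) and instr[0] == 'push':
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()  # pop operands, apply, push result
            stack.append(ops[instr](a, b))
    return stack[-1]

# 3 + 4 * 5 lowered to stack code, like Forth's "3 4 5 * +"
program = [('push', 3), ('push', 4), ('push', 5), '*', '+']
print(run(program))  # 23
```

Swap the Python loop for a bytecode dispatch table and you have the skeleton of the JVM, PostScript, and most scripting engines.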
4. The Timex Sinclair 2068 approach
Your interpretation is very plausible.
What you're describing sounds like:
- a compact bytecode for floating-point operations
- interpreted at runtime
- optimized for ROM size, not speed
And you're right:
- trig functions on 8-bit machines were slow anyway
- so trading speed for compactness made sense
This is very similar to:
- using a mini virtual machine for math routines
- or a microcoded interpreter
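The compactness trade-off is easy to illustrate. In this hedged sketch (the opcodes and routine table are invented, not the TS2068's actual scheme), each byte indexes a table of math primitives, so a multi-step float computation shrinks to a short byte string in ROM at the cost of dispatch overhead:

```python
# Hypothetical sketch of a compact math bytecode: each opcode byte indexes
# a table of primitive routines that thread a single floating-point value.
# Opcodes and routines are invented for illustration.
import math

def vm(code, x):
    """Interpret a byte string of unary math ops over one running value."""
    table = {
        0x01: lambda v: v * v,    # SQUARE
        0x02: math.sqrt,          # SQRT
        0x03: lambda v: v + 1.0,  # INC
    }
    for op in code:
        x = table[op](x)
    return x

# sqrt(x*x + 1) encoded as three ROM bytes instead of inline code
print(vm(bytes([0x01, 0x03, 0x02]), 3.0))  # sqrt(10), about 3.1623
```

Each added routine costs ROM once but every use of it costs only one byte, which is exactly the "size over speed" bargain that made sense when trig was slow anyway.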
5. Parsing math expressions (PEMDAS)
Yes—this is literally a core compiler problem.
What you likely implemented (or were exposed to) is one of:
- recursive descent parsing
- or shunting-yard algorithm (Dijkstra)
That process:
- Parse the expression (3 + 4 * 5)
- Convert it to something like Reverse Polish Notation (RPN): 3 4 5 * +
- Execute it using a stack
Which ties directly back to your Forth insight—RPN and Forth are basically cousins.
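The shunting-yard algorithm itself is short enough to sketch. This toy version handles only + and * (enough to show precedence, i.e. PEMDAS), and the evaluator is the same stack discipline Forth uses:

```python
# A small sketch of Dijkstra's shunting-yard algorithm: infix tokens in,
# RPN out, then a stack evaluates it. Only + and * are handled here.

PREC = {'+': 1, '*': 2}  # higher number binds tighter

def to_rpn(tokens):
    """Convert infix tokens to Reverse Polish Notation."""
    out, ops = [], []
    for tok in tokens:
        if tok in PREC:
            # pop operators of equal or higher precedence first
            while ops and PREC[ops[-1]] >= PREC[tok]:
                out.append(ops.pop())
            ops.append(tok)
        else:
            out.append(tok)  # numbers pass straight through
    while ops:
        out.append(ops.pop())
    return out

def eval_rpn(rpn):
    """Evaluate RPN with a stack, just as Forth would."""
    stack = []
    for tok in rpn:
        if tok in PREC:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if tok == '+' else a * b)
        else:
            stack.append(tok)
    return stack[-1]

rpn = to_rpn([3, '+', 4, '*', 5])
print(rpn, eval_rpn(rpn))  # [3, 4, 5, '*', '+'] 23
```

Note the output order: multiplication happens before addition purely because of where the operators land in the RPN stream, with no parentheses or precedence logic needed at execution time.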
6. The bigger picture
What's striking is that you independently touched on:
- compiling high-level code → machine code
- designing a C-like language
- stack-based execution models (Forth)
- bytecode/intermediate representations
- expression parsing and evaluation
That's essentially a miniature version of modern compiler architecture:
- frontend (parsing)
- intermediate representation (stack ops / bytecode)
- backend (machine code)
7. Why you "didn't need" your language anymore
Once you hit the 68000 + C on the Atari ST, the problem space changed:
- C already solved:
- portability
- structured programming
- decent performance
- Hardware improvements reduced the need for:
- extreme memory compression tricks
- custom language runtimes
So your idea didn't become wrong—it just became redundant.
Final thought
You were basically reinventing:
- a bytecode VM
- and a compiled high-level language targeting it
…years before that pattern became dominant.
If you'd stayed in that direction, you might have ended up building something very similar to:
- a Forth-like VM with a C-style frontend
- or even an early scripting language
Thursday, March 26, 2026
68000 - The CPU ahead of its time
Tuesday, March 24, 2026
Monday, March 9, 2026
Wednesday, March 4, 2026
Re: Apple’s Cheapest MacBook
https://www.msn.com/en-us/lifestyle/shopping/apple-announces-its-cheapest-ever-new-macbook/ar-AA1Xwxu0?ocid=edgemobile
Friday, February 20, 2026
The Incredible Evolution of Computers
Friday, February 13, 2026
Sunday, February 1, 2026
Sunday, January 25, 2026
X1 Lite First Look! The Best Budget Mini PC of 2026? They Might’ve Nailed It
The CPU benchmark is 4% worse than the Minisforum UM790 Pro I bought 2.5 years ago, and the 32 GB/1 TB model costs the same, probably due to high RAM prices.
My UM790 Pro had an overheating problem and died, so I had to get it fixed, and I still see thermal throttling. So if this computer runs cooler, it's the better option.