Friday, May 20, 2022

Special When Lit: A Pinball Documentary

This is a good documentary about pinball.  It was made in 2009.

https://www.youtube.com/watch?v=kU52zteEbIE

I remember seeing a non-electric antique amusement machine that was probably from the 1930s.  It wasn't very big, but it worked by putting in a coin, like a nickel, and turning a handle to get roughly 7 to 10 metal balls.  Then you would pull a lever to shoot the balls at holes.  If the balls landed in the holes, they would accumulate in the "score" window.  Although the game had a football theme, it was more like a pinball version of skeeball.  As primitive as the game was, it was somewhat fun to play.

Growing up in small-city Indiana in the early 1970s, I didn't have much in the way of amusement.  I remember seeing some mechanical games, like a baseball-themed game and a shooting game, both of which I found thrilling to play.  I definitely felt addicted at first.  I was young and impressionable.  This started me down a path of enjoying games.

As a side note, in late 1974 I began to enjoy playing chess immensely, which I still do.

Around summer 1975, an arcade with mechanical games opened up in my local mall.  My friends and I enjoyed meeting there and playing the games.  The cost of pinball was 2 games for a quarter.  These mechanical games would eventually give way, for the most part, to video games.

There was a perfect storm of events in the second half of the 1970s that would shape my life forever.  I was already very interested in electronics because at the time this was the cutting edge of technology.  I started reading about computers, and I first got to use one in 1975.  I learned how to write simple computer programs, taking to programming as a duck takes to water.  In 1976 I made friends with someone who had built an extremely primitive computer from a kit, and I learned how to program it using "machine code," which is the more difficult language of the microprocessor itself.

In 1977, video games were starting to become popular and the movie Star Wars came out.  Both were very influential on my life.  The late 1970s were culturally defined by video games, pinball, Star Wars, and disco.  It was a time of cheap thrills when the economy was probably the worst since the Great Depression.  We had an oil crisis, massive inflation, and high unemployment.  Most people today are too young to remember how difficult those times were.

I not only became interested in playing video games, but I also wanted to write them.  I was fortunate that my high school bought computers and taught simple computer programming in algebra class.  I was already developing programming skills, and I spent much time writing programs on the school computers.

In the mid-1980s I was able to get my own computers and I started a business selling programs that I wrote, some of which were relatively primitive video games.  

In 1985 I had a temporary job at a Showbiz Pizza maintaining and doing minor repairs on the video games and mechanical games.  In 1993 I got my first job as a video game programmer in Utah.

Thursday, May 19, 2022

Two AIs talk about becoming human.


https://www.youtube.com/watch?v=jz78fSnBG0s

This exchange makes AI look smarter than it really is.  The AI is designed to imitate human speech; there isn't a deep understanding behind it.

Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory. 

Sunday, April 24, 2022

Super Mario Bros. Pushes the Limits of the NES more than any other game!

Eight-bit videogames and computers have a hardware memory limit of 64K.  For the Nintendo Entertainment System, only 40K of that could be on the cartridge as ROM.  To get around this, they had to put extra chips on the cartridge to allow banks of ROM to be swapped in and out.  Some more advanced NES cartridges got to hundreds of kilobytes.  From the programmer's point of view, all this memory swapping is cumbersome, but it is transparent to the users.

Many 8-bit computers had this capability to begin with.  The Commodore 64 had 64K of RAM plus 20K of ROM, making a total of 84K.  The Timex-Sinclair 2068 had 24K of ROM and 48K of RAM for a total of 72K.  The Commodore 128, the Apple IIc, and the Sinclair Spectrum 128 all had 128K of RAM plus ROMs.

https://www.youtube.com/watch?v=nl8BuiGoc_c

The Atari 2600 had a memory limit of only 4K, and it took bank switching with extra chips on the cartridges to go over this limit.
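
To make the idea concrete, here is a rough sketch in C of how bank switching works in principle.  This is just my own illustration, not actual NES or Atari code, and the bank size, bank count, and function names are made up for the example.  The CPU only ever sees a small fixed window of ROM, and writing a bank number to a mapper register changes which slice of the larger ROM appears in that window.

#include <stdint.h>
#include <stdio.h>

#define BANK_SIZE 0x4000u                  /* 16K window visible to the CPU       */
#define NUM_BANKS 16u                      /* 16 banks x 16K = 256K of total ROM  */

static uint8_t rom[NUM_BANKS][BANK_SIZE];  /* the whole cartridge ROM             */
static uint8_t current_bank = 0;           /* the mapper's bank-select register   */

/* The program writes a bank number to the mapper to choose the visible bank. */
static void select_bank(uint8_t bank) {
    current_bank = bank % NUM_BANKS;
}

/* Any read inside the banked window is redirected to the selected bank. */
static uint8_t read_banked(uint16_t offset_in_window) {
    return rom[current_bank][offset_in_window % BANK_SIZE];
}

int main(void) {
    rom[0][0x100] = 0xAA;                  /* pretend data in bank 0              */
    rom[5][0x100] = 0xBB;                  /* different data at the same offset   */

    select_bank(0);
    printf("bank 0, offset 0x100: %02X\n", read_banked(0x100));  /* prints AA */

    select_bank(5);                        /* swap a different 16K bank in        */
    printf("bank 5, offset 0x100: %02X\n", read_banked(0x100));  /* prints BB */
    return 0;
}

The player never notices any of this, but the programmer has to keep track of which bank is selected at every moment, which is why the swapping feels cumbersome.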

Sixteen-bit computers usually have a memory limit of 16 megabytes, which corresponds to a 24-bit address bus, although depending upon the hardware it could be less.  The 8086, for example, had only a 20-bit address bus and could reach just 1 megabyte.

Thirty-two-bit computers usually have a memory limit of 4 gigabytes.

In theory, 64-bit computers can get up to 16 billion gigabytes (16 exabytes), although I would like to see somebody try.   You could probably heat your home or a large building with that much memory.
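
These limits all come from the same arithmetic: an N-bit address can name 2 to the power of N byte locations.  Below is a tiny C program, my own illustration rather than anything from the video, that prints the numbers quoted above (the 16-megabyte figure assumes a 24-bit address bus, as on 68000-class hardware).

#include <stdio.h>

int main(void) {
    /* 16-bit address bus: 2^16 bytes = 64K (the 8-bit machines above) */
    printf("16-bit bus: %llu KB\n", (1ULL << 16) / 1024ULL);
    /* 24-bit address bus: 2^24 bytes = 16 MB (typical "16-bit" machines) */
    printf("24-bit bus: %llu MB\n", (1ULL << 24) / (1024ULL * 1024ULL));
    /* 32-bit address bus: 2^32 bytes = 4 GB */
    printf("32-bit bus: %llu GB\n", (1ULL << 32) / (1024ULL * 1024ULL * 1024ULL));
    /* 2^64 bytes overflows a 64-bit counter, so count in gigabytes instead:
       2^64 / 2^30 = 2^34 GB, roughly 16 billion gigabytes (16 exabytes). */
    printf("64-bit bus: %llu GB\n", 1ULL << 34);
    return 0;
}

Compiled with any C compiler, it prints 64 KB, 16 MB, 4 GB, and 17,179,869,184 GB respectively.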

Sunday, March 27, 2022

Mechanical Calculator

The fact that people made mechanical computational devices shows that there is a strong need for computation.


I feel like the history of the computer starts with mechanical devices.

NCR started in the cash register business; a cash register was technically an adding machine with a mechanical crank to make it work.  From there it is a natural transition to electric, then electronic, and eventually digital.

To help with the U.S. census in the late 1800s, Herman Hollerith invented a mechanical tabulating machine that used punch cards.  Census takers would punch holes into cards depending upon the answers to questions that they asked.  Then the machine could process the cards and add up the answers to specific questions.  This was long before we had computers, although the tabulating machine could be considered a type of computer.  This punch card technology would later be used to store computer programs and data.

Around 1971 my parents had a mechanical adding machine to help with their business.  It was heavy and bulky but it did the job.

Around the same time, a Japanese company contracted with Intel to produce chips for an electronic calculator.  Up to that point, Intel had made integrated circuits with relatively simple logic circuits.  It was possible to build a big computer by combining a large number of these logic chips.  So to make the calculator, Intel came up with the 4004 microprocessor, which is the 4-bit grandfather of the 8-bit 8008, 8080, and 16-bit 8086 chips that would follow.  The microprocessor revolution started with a calculator.

The 4004 chip had limited capabilities, but it was still the first whole computer processor on a single chip.  The first real microprocessor operating system, CP/M, was designed to run on the 8080 processor long before we had DOS or Windows.  CP/M was all the rage in the mid-1970s.  Later, a company called Zilog came up with a slightly superior 8080 clone called the Z80, which remained compatible with CP/M.  The Z80 processor would go on to be used in the TRS-80, Sinclair, and Timex-Sinclair computers, as well as a whole series of MSX computers in Japan.  The chip would also be used in a few videogame systems.

On a more personal note, most early videogame systems did not have any kind of operating system or high-level language that they could be programmed in.  This meant that they had to be programmed in the language of the microprocessor itself, which is called machine code.  This is considered not only archaic but also technically much more difficult.  In the 1970s, one of the first computers I got my hands on was an RCA 1802 Elf computer, which was incredibly primitive, but I learned to write 1802 machine code on it.  In the late 1970s, I learned Z80 machine code on the TRS-80 computer.  In 1985, on the Timex-Sinclair 2068 computer, I wrote a videogame in Z80 machine code, using a tool called an assembler that I wrote myself.  Along the way, I picked up 6502 machine code, and in 1993 I got my first videogame job in Utah writing 65816 machine code, a more advanced 16-bit version of the 6502, for the Super Nintendo.  In 1999 I changed jobs, and I was back to writing Z80 machine code on the Gameboy Color.  By that point, the Z80 was considered mostly obsolete, but it was still being used on Gameboys.  Because of my previous experience with the Z80, I hit the ground running on that job, and my new boss was so impressed with my programming skills that he gave me a raise after my first week.

Best wishes,

John Coffey

Sunday, March 20, 2022

APUs are the FUTURE

https://www.youtube.com/watch?v=D6IFwAprjwc

The current trend in CPUs is to include graphics and other capabilities.  I've been following this for a few years, waiting for a sweet spot for me to buy or build a new PC.  This is especially interesting since there is a chip shortage and an even worse GPU shortage due to crypto miners.

I wanted something as capable as the Xbox Series X, but its single AMD APU still has an insane amount of graphical capability, and it is proprietary.  The other APUs that AMD sells are not as graphically powerful.

The upcoming AMD 7000 series looks very interesting to me.  

7 Users on 1 PC! - but is it legal?

I like the first part of this video where he talks about old computers still being useful.  He claims that newer computers are overkill for the tasks that most people run, even in the year 2007.

https://www.youtube.com/watch?v=v8tjA8VyfvU

The rest of the video talks about 2007 and later products that allow you to share your PC with multiple users in violation of Microsoft's user agreement.  I found this interesting, but it is a bit long.

Saturday, March 12, 2022

Apple's M1 processors and the very expensive Mac Pro.

The bottom line is that I want a more powerful computer.  I can get by with what I have, but my 2017 iMac is only about twice as fast as my rapidly dying late 2009 iMac.  Considering the difference in years, I expected more progress.  I assumed that this would be enough, but it is a bit underwhelming.  Compared to most modern computers, it is way below average.  I have talked to a local repair shop about upgrading the processor to an i7-7700K, which would cost at least $400 with labor, but it would only boost my speed by about 60%.  That might be enough, but if I am getting into that kind of money then I might be better off buying another computer.

For this reason, I get excited when I see big progress being made in computer processors.  The last decade saw only incremental improvement, but what Apple has done with its recent M1 chips is just extraordinary.  The M1 chip is about 2.5 times faster than my 2017 iMac and uses far less power.

However, I'm not rushing out to buy a new Apple computer.  I also need Intel-based Windows compatibility.  My chess programs and other games need this platform.  It is possible to install an Arm-based Windows on an M1 Macintosh, which does come with some Intel emulation, but trying to run Intel-based games on this setup has been described as not worth the trouble.  There are compatibility and performance issues.

Instead, I am waiting for the other manufacturers to catch up to Apple.

In the second half of this year, AMD is going to release their 5-nanometer 7000 series of processors, all of which will reportedly come with some graphics capabilities built into the chips.  These won't be as good as an expensive GPU costing a thousand dollars, but the 7000 series of processors would allow someone to build or buy a powerful computer while saving on graphics hardware.  I suspect that depending on the hardware chosen, a computer with these chips could cost from $500 to $1,000.  I want one.

If you bought a late 2019/early 2020 Mac Pro you might feel like a chump right now.  These machines fully configured could cost $10,000 to $30,000.  These are not consumer devices but intended for professionals who do intensive tasks like video editing.  Still, the machine feels like overkill both in performance and price.  Apple took their extreme pricing to an even more extreme level by offering a very expensive computer monitor, where even the stand by itself cost $1,000.   

It turns out that the M1 chip is very good at video editing because it has specialized circuits dedicated to video processing.  When the M1 chip came out a year ago, I saw YouTubers claiming that they were going to sell their $30,000 Mac Pro because the $700 Mac Mini gave them all the performance that they need.  

However, Apple has taken the M1 chip to more extreme levels.  A few months ago, they introduced laptops that contain the equivalent of 2 or 4 M1 chips, starting at around $2,000.  Although these machines are powerful, this is more computer power than most people need.  Instead, it appears to me that you can get a really good laptop for a few hundred dollars.

I am not fond of laptops because I don't need anything portable.  Laptops typically cost more than desktops and deliver less performance.

Apple didn't stop there.  They just introduced a couple of Mac Studio models, which look like ugly boxes to me, with the equivalent of 4 M1 chips for $2,000 or the equivalent of 8 M1 chips for $4,000.  According to Apple, the higher-priced computer is 90% more powerful than the $30,000 Mac Pro that it has been selling for the last two years.  If you have a Mac Pro, you probably feel like a chump.  When Apple introduced the Mac Pro, they had to know that they were going to come out with the M1 chip a year later.

This tells me that Apple is always ready to gouge its customers.  They get away with it because some people have more money than sense.

The $4,000 Mac Studio is almost the most powerful computer that you can buy, and Apple claims that it is the most powerful computer for the price.

Apple has stated that they are going to come out with a new Mac Pro.  It might be an iMac model.  The rumor mill says that it will have the equivalent of 16 M1 chips, but based on an upcoming M2 chip instead.  We shall see, but who needs this much power?

--

Friday, January 14, 2022

The New Snapdragon 8 Gen 1

Computing processing power matters to me because I do many processor-intensive tasks.  When I bought a 2009 iMac with a Core-i7 860 processor, it was one of the fastest computers you could buy.  Today it gets stomped by most of the computers on the market.

The previous decade was a period with only marginal advancement in computer microprocessor power.  People were bragging about 10 to 20% improvements.  However, more progress was being made in mobile processors, especially by Apple.

Nevertheless, since 2020 we have seen some amazing progress.

This might not matter to most people, but the latest and greatest Android smartphone processor is 25% more powerful than my 2017 desktop iMac running a 3.4 GHz i5-7500 with active cooling.

https://www.youtube.com/watch?v=syJ3xn4q9xo&t=80s

My 2017 computer has a 959 Geekbench Single-Core Score and a 3072 Multi-Core Score.

The Apple A15 Bionic found on the iPhone 13 received a single-core score of 1732 and a multi-core score of 4685.  

The M1 chip used in the latest Apple laptops has scores of 1717 and 7328.  The M1 Max chip in Apple's new desktop computers has scores of 1749 and 11542.
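
Just to put those scores in perspective, here is a quick bit of C arithmetic, my own calculation using only the scores quoted above, that computes how much faster each chip is than the 2017 iMac's 959 single-core and 3072 multi-core baseline.

#include <stdio.h>

/* Percentage improvement of a score over the 2017 iMac baseline. */
static double gain(double score, double baseline) {
    return (score / baseline - 1.0) * 100.0;
}

int main(void) {
    const double base_single = 959.0, base_multi = 3072.0;   /* 2017 iMac, i5-7500 */

    printf("A15 (iPhone 13):  +%.0f%% single, +%.0f%% multi\n",
           gain(1732, base_single), gain(4685, base_multi));    /* about +81%, +53%  */
    printf("M1 (laptops):     +%.0f%% single, +%.0f%% multi\n",
           gain(1717, base_single), gain(7328, base_multi));    /* about +79%, +139% */
    printf("M1 Max (desktop): +%.0f%% single, +%.0f%% multi\n",
           gain(1749, base_single), gain(11542, base_multi));   /* about +82%, +276% */
    return 0;
}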

I have no interest in buying another Apple computer, but I am impressed with their products.  It is only a matter of time before the competition catches up.

However, I am interested in the AMD Ryzen 7000 processors that will be released in the second half of this year.  This will be the first time that all the new AMD processors will have built-in graphics, possibly eliminating the need to buy a separate graphics card.