Tuesday, May 31, 2022

The best $589 you can spend at Apple

https://www.youtube.com/watch?v=-CWTBzZcp_k

Apple silicon doesn't run the Windows programs that I use, at least not without using some sort of very elaborate emulation.  Still, it is over twice as fast as my 2017 iMac desktop.


Tuesday, May 24, 2022

Space Invaders: Atari Archive Episode 32

For those who have some interest in video games.

https://www.youtube.com/watch?v=ad3TLYZOI-M

I can't emphasize enough how difficult it is to write programs for the Atari 2600, also called the Atari VCS.  The machine has only 128 bytes of RAM and no video memory at all.  Instead, as the raster draws the picture on the TV screen, the microprocessor has to constantly tell the display which pixels to draw.  It is a miracle that it can display anything at all.  The code necessary to draw the screen is contained in the ROM cartridge.  Most of the microprocessor's time is spent drawing the screen, so any game logic has to be done during the television's vertical blank, the period when the electron beam moves from the bottom of the screen back to the top to start the next frame.  The vertical blank lasts about 1330 microseconds and happens sixty times per second.
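
To make the structure a little more concrete, here is a rough sketch of how a 2600 frame is organized.  It is written in C purely for readability; real 2600 code is hand-timed 6502 assembly, and every function name below is a hypothetical stand-in for writes to the console's actual TIA and RIOT chip registers.

/* A rough sketch of one Atari 2600 frame, written in C for readability.
   Real 2600 programs are cycle-counted 6502 assembly; each function
   below is a hypothetical stand-in for writes to the TIA and RIOT
   chip registers. */

#define VISIBLE_LINES 192                         /* NTSC picture area              */

static void start_vertical_blank(void)        {}  /* blank the beam, start a timer  */
static void update_game_logic(void)           {}  /* all game logic squeezes in here */
static void wait_for_vblank_timer(void)       {}  /* burn the remaining blank time  */
static void end_vertical_blank(void)          {}  /* picture begins on the next line */
static void write_graphics_for_line(int line) { (void)line; }
static void wait_for_end_of_line(void)        {}  /* like strobing the TIA's WSYNC  */
static void do_overscan(void)                 {}  /* a little more logic time       */

void do_one_frame(void)
{
    start_vertical_blank();
    update_game_logic();                 /* roughly 1.3 milliseconds to work with */
    wait_for_vblank_timer();
    end_vertical_blank();

    for (int line = 0; line < VISIBLE_LINES; line++) {
        /* The "kernel": there is no frame buffer, so the CPU must hand
           the TIA the playfield and sprite graphics for this scanline
           before the beam reaches them, about 76 CPU cycles per line. */
        write_graphics_for_line(line);
        wait_for_end_of_line();
    }

    do_overscan();                       /* then the whole frame repeats */
}

The picture itself is effectively part of the program: if the code in that loop misses its timing by even a few cycles, the display visibly glitches.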

There were a few rare 2600 cartridges that had extra chips on them to boost the memory or the capabilities of the machine.  These special cartridges only got made once the chips became cheaper, in the late 1980s, near the end of the 2600's life.

Some early primitive computers with limited memory, like the Sinclair ZX80, ZX81, and Timex Sinclair 1000, also used the microprocessor to draw the display.  This didn't involve computer code like on the 2600, but a hardware trick that made the microprocessor copy bytes from memory to the display.  It is my understanding that the first Macintosh lost about 40% of its processor time driving its display.

Memory limitations drove the graphics on all video game systems and computers throughout the 1980s.  Instead of every pixel having its own memory location, which has only been the norm since the mid-1990s, the screen was made up of tiles, or blocks, somewhat like the characters on a computer keyboard.  Each tile could be defined to be whatever you wanted, usually with a limited number of colors.  When I was programming on the Super Nintendo, the artists would create the tiles, and the program would tell the hardware where to display each tile on the screen.  Objects that move on the screen are called "sprites"; they are made up of their own separate tiles, and the hardware displays them in front of the background tiles.  Since the mid-1990s these display methods have no longer been necessary, because the chips are faster and the systems have more memory.
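
To illustrate the idea, here is a small sketch in C of how a tile-based background and sprites fit together.  The sizes and names are invented for the example and are only in the general spirit of 8-bit and 16-bit hardware, not a description of any actual chip.

/* Simplified sketch of tile-based graphics.  All sizes and names here
   are illustrative, not actual console hardware. */

#include <stdint.h>

#define TILE_W  8          /* tiles are small fixed-size blocks         */
#define TILE_H  8
#define MAP_W   32         /* the background is a grid of tile numbers  */
#define MAP_H   28

uint8_t tiles[256][TILE_H][TILE_W];   /* tile set: 256 reusable patterns */
uint8_t tile_map[MAP_H][MAP_W];       /* which tile sits in each cell    */

/* A sprite is just a movable tile drawn in front of the background. */
struct sprite { int x, y; uint8_t tile; };

uint8_t color_of_pixel(int x, int y, const struct sprite *spr, int nspr)
{
    /* Sprites are checked first because the hardware shows them in
       front of the background tiles. */
    for (int i = 0; i < nspr; i++) {
        int dx = x - spr[i].x, dy = y - spr[i].y;
        if (dx >= 0 && dx < TILE_W && dy >= 0 && dy < TILE_H) {
            uint8_t c = tiles[spr[i].tile][dy][dx];
            if (c != 0)                   /* color 0 = transparent       */
                return c;
        }
    }
    /* Otherwise look up the background tile under this screen position. */
    uint8_t t = tile_map[y / TILE_H][x / TILE_W];
    return tiles[t][y % TILE_H][x % TILE_W];
}

The memory saving comes from reuse: the map stores only one byte per 8x8 cell, so a tile that appears hundreds of times on screen is stored just once.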



I’m tired of winning (and it's awesome)

https://www.youtube.com/watch?v=zTZuD4fVutc

The next generation of AMD CPUs coming this year has a big boost in CPU power, but not a big boost in integrated graphics.  I've been wanting a powerful APU, a chip that combines the CPU and strong graphics and saves the cost of a separate graphics card, similar to the custom chips in the Xbox Series X and the Sony PlayStation 5.

The current-generation AMD 5950X is a beast of a processor and can play games, but its graphics capability is very low compared to the video game systems.

However, the next generation of AMD APUs is not coming out until next year, or maybe the fourth quarter of this year as laptop processors.  If I want a powerful CPU and reasonably powerful graphics, then I would either have to buy a new CPU and a graphics card, or settle for an upcoming laptop processor.  I think that 2023 should bring me some good options, although I was hoping to upgrade this year.

My 2017 iMac can play games better than I expected.  It has a low-end graphics card like what would be found in a laptop.  However, the CPU power is unimpressive.  I have the option of upgrading the processor to an i7-7700K, at a cost of $350 to $400, but I would still be a few years out of date.  The better option is to wait for the next generation.

Friday, May 20, 2022

Special When Lit: A Pinball Documentary

This is a good documentary about pinball.  It was made in 2009.

https://www.youtube.com/watch?v=kU52zteEbIE

I remember seeing a non-electric antique amusement machine that was probably from the 1930s.  It wasn't very big, but it worked by putting in a coin, like a nickel, and turning a handle to get roughly 7 to 10 metal balls.  Then you would pull a lever to shoot the balls at holes.  If the balls landed in the holes, they would accumulate in the "score" window.  Although the game had a football theme, it was more like a pinball version of Skee-Ball.  As primitive as the game was, it was somewhat fun to play.

Growing up in small-city Indiana, there wasn't much amusement in the early 1970s.  I remember seeing some mechanical games, like a baseball-themed game and a shooting game, both of which I found thrilling to play.  I definitely felt addicted at first.  I was young and impressionable.  This started me down a path of enjoying games.  

As a side note, in late 1974 I began to enjoy playing chess immensely, which I still do.

Around the summer of 1975, an arcade with mechanical games opened in my local mall.  My friends and I enjoyed meeting there and playing the games.  The cost of pinball was two games for a quarter.  These mechanical games would eventually give way, for the most part, to video games.

There was a perfect storm of events in the second half of the 1970s that would shape my life forever.  I was already very interested in electronics, because at the time it was the cutting edge of technology.  I started reading about computers, and I first got to use one in 1975.  I learned how to write simple computer programs, taking to programming as a duck takes to water.  In 1976 I made friends with someone who had built an extremely primitive computer from a kit, and I learned how to program it using "machine code", the more difficult language of the microprocessor itself.

In 1977 video games were starting to become popular and the movie Star Wars came out.  Both were very influential on my life.  The late 1970s were culturally defined by video games, pinball, Star Wars, and disco.  It was a time of cheap thrills, when the economy was probably the worst it had been since the Great Depression.  We had an oil crisis, massive inflation, and high unemployment.  Most people today are too young to remember how difficult those times were.

I not only became interested in playing video games, but I also wanted to write them.  I was fortunate that my high school bought computers and taught simple computer programming in algebra class.  I was already developing programming skills, and I spent much time writing programs on the school computers.

In the mid-1980s I was able to get my own computers and I started a business selling programs that I wrote, some of which were relatively primitive video games.  

In 1985 I had a temporary job at a ShowBiz Pizza, maintaining and doing minor repairs on the video games and mechanical games.  In 1993 I got my first job as a video game programmer, in Utah.

Thursday, May 19, 2022

Two AIs talk about becoming human.


https://www.youtube.com/watch?v=jz78fSnBG0s

This exchange makes AI look smarter than it really is.  It is an AI designed to imitate human speech.  There isn't a deep understanding.

Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory.
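
For what it's worth, "autoregressive" just means the model produces text one token at a time, feeding everything generated so far back in as the input for the next prediction.  The sketch below, in C, shows only that outer loop; predict_next() is a hypothetical placeholder for the neural network, which is where all of GPT-3's real complexity lives.

/* The outer loop of an autoregressive text generator.  predict_next()
   is a hypothetical placeholder; in GPT-3 it is a neural network with
   roughly 175 billion parameters. */

#include <stdio.h>

#define MAX_TOKENS 12

static int predict_next(const int *tokens, int count)
{
    /* Dummy rule so the sketch runs: echo the last token plus one. */
    return tokens[count - 1] + 1;
}

int main(void)
{
    int tokens[MAX_TOKENS] = { 10, 20, 30 };           /* the "prompt"        */
    int count = 3;

    while (count < MAX_TOKENS) {
        tokens[count] = predict_next(tokens, count);   /* one token at a time */
        count++;
    }

    for (int i = 0; i < count; i++)
        printf("%d ", tokens[i]);
    putchar('\n');
    return 0;
}

The model is only ever choosing a plausible next token, which is why the conversation in the video can sound fluent without there being any deep understanding behind it.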