Friday, December 30, 2022

Avatar Review



"Avatar" is a science fiction epic directed by James Cameron and released in 2009. The movie tells the story of Jake Sully, a disabled former Marine who is sent to the distant planet of Pandora to participate in a corporate-funded project to mine a valuable mineral called unobtanium. While there, Jake falls in love with the native Pandora inhabitants, the Na'vi, and becomes torn between his loyalty to his human employers and his growing connection to the Pandora ecosystem and its inhabitants.

One of the standout elements of "Avatar" is its visuals and special effects, which were groundbreaking at the time of its release and hold up well even by today's standards. The use of motion capture technology and 3D animation allowed the filmmakers to create fully realized, lifelike characters and breathtakingly realistic environments that are a joy to behold. The movie's action scenes are also well-choreographed and exciting, with impressive set pieces that showcase the unique creatures and landscapes of Pandora.

The story of "Avatar" is not particularly original, with elements of the "white savior" trope and a simplistic, good-versus-evil narrative that pits the human characters against the Na'vi. However, the movie's themes of environmentalism and cultural imperialism are timely and thought-provoking, and the performances of the cast, particularly Sam Worthington as Jake and Zoe Saldana as the Na'vi warrior Neytiri, are strong.

Overall, "Avatar" is a visually stunning and entertaining action adventure that is worth seeing for its groundbreaking special effects and strong performances. While its story may be somewhat simplistic and familiar, the movie's themes and visuals more than make up for it.

--
Best wishes,

John Coffey


P.S.  I didn't write this review.  An AI called ChatGPT did.




Tuesday, December 20, 2022

What It's Like To be a Computer: An Interview with GPT-3

There is this thing called the Turing Test, invented by Alan Turing seventy years ago.  The idea is to see if a computer can become smart enough to fool a human into thinking he is talking to a real person.  Computers have nearly reached that point.

https://www.youtube.com/watch?v=PqbB07n_uQ4

The AI appears to understand more than it actually does.  It has studied human conversation and a mountain of raw information so that it can imitate human conversation.

However, having a conversational computer isn't the only threshold the machines have crossed recently.  Computer AI has become much more useful, performing all kinds of new tasks, such as assisting with surgery or writing computer code.  By the end of the decade, machines will be performing many more jobs.  It is very likely that in the next couple of decades, or even in this one, we will have general-purpose robots that can perform almost any task we want them to do.

Friday, December 16, 2022

The Current State of Windows on ARM-Architecture (& Its Promising Future)

The industry is moving toward efficient ARM processors, in an effort to catch up to Apple.  

https://www.youtube.com/watch?v=psbucvxF-UU&t=468s

This video is talking about a future processor.  I saw another video claiming that Microsoft's current ARM-based devices fall way short of what Apple's M-series chips can do.

Although the video claims that there is no problem running x86 programs on ARM processors using emulation (https://youtu.be/psbucvxF-UU?t=195), reportedly many videogames have difficulty or don't work at all under emulation.  However, if we are moving toward two competing hardware architectures, I hope that many software makers will compile their software to work on both platforms.  For video games, it can be more complicated because the GPU hardware can be different.
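As a rough illustration of what compiling the same program for both platforms can look like, here is a minimal C sketch (my own example, not from the video): the source stays the same, and the compiler's predefined architecture macros select any platform-specific path at build time.

    #include <stdio.h>

    int main(void) {
    /* The same source file can be built for x86-64 or 64-bit ARM; the
       compiler defines different macros for each target architecture. */
    #if defined(__x86_64__) || defined(_M_X64)
        printf("Built for x86-64\n");
    #elif defined(__aarch64__) || defined(_M_ARM64)
        printf("Built for 64-bit ARM\n");
    #else
        printf("Built for some other architecture\n");
    #endif
        return 0;
    }

Most programs that avoid architecture-specific tricks need little or no code like this; recompiling for the other platform is often enough.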

I don't think that Intel will stand still as their processors get out-competed.  All they would need to do is come up with more powerful processors to recapture the market.  They are also planning to come out with 3-nanometer chips in a couple of years.

However, ARM's main strength is power efficiency, so it will remain popular on portable devices.

According to one video, the industry is moving away from PCs as we know them today and toward system-on-a-chip designs like Apple's.  Although these tend to be more efficient, you can't change the hardware configuration, like the memory size, after you buy them.



NintenDeen's Questioning if Donkey Kong Country 2 is the Greatest 2D Pla...

https://youtu.be/0F813wYuhVk

Tuesday, November 29, 2022

How Machine Language Works


I was fortunate that I got interested in computers really early, back in 1975, which was a couple of years before computers like the Apple II, Commodore PET, and TRS-80 came out.  I was also fortunate that someone I met lent me a computer that he had built from a kit, an RCA Elf.  This computer was so primitive and had so little memory, only a quarter K of RAM, that you had to program it with a calculator-like keypad, entering numerical instructions into specific memory locations.  I was able to master this just enough to get a working knowledge of machine code programming.

There was a saying going back to this time period that if you knew how to program in machine code then you had a much deeper understanding of how computers work.  I learned several machine languages, and this proved very useful to me in getting jobs in the video game industry and being able to do those jobs.  When I went to work for Xanterra in 1999 to do Gameboy Color programming, I sort of hit the ground running because I already knew how to program Z80s, which I had learned in the 1970s.  The owner of the company was impressed enough with my skills that he gave me a raise after my first week.
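To give a feel for what machine code actually is, here is a tiny C illustration of my own (not code from any of those jobs): the bytes in the array are a complete little Z80 program, and the comments show the assembly mnemonics a human would use for the same numbers.

    #include <stdio.h>

    int main(void) {
        /* Raw Z80 machine code: the numbers ARE the program. */
        unsigned char program[] = {
            0x3E, 0x05,   /* LD A,5  - load the value 5 into register A */
            0xC6, 0x03,   /* ADD A,3 - add 3 to register A              */
            0x76          /* HALT    - stop the processor               */
        };

        for (unsigned i = 0; i < sizeof program; i++)
            printf("%02X ", (unsigned)program[i]);
        printf("\n");
        return 0;
    }

On a machine like the RCA Elf, you entered exactly this kind of numeric sequence by hand, one byte at a time, into specific memory locations.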

https://www.youtube.com/watch?v=HWpi9n2H3kE

Sunday, November 20, 2022

iPad Pro M2: What Does "Pro" Even Mean?

https://www.youtube.com/watch?v=O_WbmIIy4vk

If you have a good smartphone, a tablet feels unnecessary.

The last thing I need is a tablet that is 2.5 times faster than my desktop computer.  This is the kind of power you want on a laptop or a desktop.

The M1 is only 7% slower than the M2.  

A couple of years ago, I bought the Amazon Fire tablet on Black Friday for $80, which is not a powerful tablet, but it works just fine as a portable Internet and streaming device.

4K Gamer Pro Review

https://www.youtube.com/watch?v=dL9U6n4IixQ

I did some experimenting on my computer by playing video games and videos at resolutions from 720P up to 5K. With my eyesight, which is about 20/30, on a 27-inch screen, I could not tell a difference between 1080P and anything higher. We are talking about levels of detail that are hard to perceive. I personally prefer 1440P, not that it is really an improvement over 1080P; it may be more psychological that I think 1440P looks better.

Even if you have 20/20 vision and a 60-inch screen, you are going to be sitting further back, whereas I sit very close to my 27-inch screen. Can people really tell a difference with 4K on a big-screen TV?

Many years ago somebody made a video arguing that 4K was unnecessary because the human eye cannot resolve the difference. If it is unnecessary on a 55-inch TV, then it is probably unnecessary on a smartphone. I bought an iPhone 10R, which has a sub-1080P resolution, yet I never notice the resolution being too coarse.
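As a rough sanity check of my claim, here is a back-of-the-envelope C sketch, assuming a 27-inch 16:9 screen viewed from about 24 inches and the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree (with 20/30 vision the threshold is lower):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double deg = 3.14159265358979 / 180.0;  /* one degree in radians */
        const double diagonal_in = 27.0;
        const double viewing_in  = 24.0;              /* assumed viewing distance */
        const double width_in = diagonal_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
        const int widths_px[] = { 1280, 1920, 2560, 3840, 5120 };

        for (int i = 0; i < 5; i++) {
            double ppi = widths_px[i] / width_in;            /* pixels per inch */
            double inches_per_degree = 2.0 * viewing_in * tan(0.5 * deg);
            /* Rule of thumb: ~60 pixels per degree is about the 20/20 limit. */
            printf("%4d pixels wide: about %.0f pixels per degree\n",
                   widths_px[i], ppi * inches_per_degree);
        }
        return 0;
    }

By this rough math, the jump from 1080P to 1440P already sits right around the limit of what 20/30 eyesight can resolve at that distance, and everything above that is past it, which matches my experience.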

Friday, November 18, 2022

Sprites



The word "sprite" is interesting. It means elf, fairy, or ghost, although it can also refer to flashes of different color lights in clouds caused by lightning. The word originated in the middle ages from the word "spirit". When I hear the word, I think of the Disney character Tinkerbell.

In computers and video games, a sprite is an image that can move on top of a background. Usually, these are 2D objects moving on top of a 2D background, although a game like the original Doom had 2D objects moving on top of a 3D background. The mouse pointer on a computer screen is technically a sprite.

Back in the days when computers and video games were 8-bit and 16-bit, it was helpful to have hardware support for sprites, which allowed graphical objects to move around independently of the background. This was helpful because manipulating the on-screen graphics was taxing for old, slow computers without hardware sprites. When I was writing games for the Timex Sinclair 2068 and Atari ST computers, I had to write software to make all the graphics move because there was no hardware support for sprites, which made the task more technically challenging.
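For anyone curious what "writing software to make the graphics move" boils down to, here is a minimal C sketch (an illustration, not code from those games): copy a small block of sprite pixels onto the background buffer, skipping a transparent color, and redraw it at a new position each frame.

    #include <stdio.h>

    #define SCREEN_W 16
    #define SCREEN_H 8
    #define TRANSPARENT 0   /* pixel value treated as "see-through" */

    /* Copy a w-by-h sprite onto the background at (x, y), skipping
       transparent pixels.  Hardware sprites do roughly this for free;
       without them, the CPU has to do it every frame. */
    static void draw_sprite(unsigned char screen[SCREEN_H][SCREEN_W],
                            const unsigned char *sprite, int w, int h,
                            int x, int y)
    {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++) {
                unsigned char p = sprite[row * w + col];
                int sx = x + col, sy = y + row;
                if (p != TRANSPARENT && sx >= 0 && sx < SCREEN_W &&
                    sy >= 0 && sy < SCREEN_H)
                    screen[sy][sx] = p;
            }
    }

    int main(void) {
        unsigned char screen[SCREEN_H][SCREEN_W] = {{0}};
        const unsigned char ball[4] = { 0, 1,   /* a tiny 2x2 "sprite" */
                                        1, 1 };
        draw_sprite(screen, ball, 2, 2, 5, 3);  /* frame 1 position    */
        draw_sprite(screen, ball, 2, 2, 6, 3);  /* frame 2: moved right */

        for (int y = 0; y < SCREEN_H; y++) {
            for (int x = 0; x < SCREEN_W; x++)
                putchar(screen[y][x] ? '#' : '.');
            putchar('\n');
        }
        return 0;
    }

A real game also has to restore the background underneath the sprite's old position before drawing it at the new one, which is part of what made this costly on slow machines.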

The early arcade video games used hardware sprites and so did all early home video game consoles. The sprites on the Atari 2600 are extremely primitive and very difficult to program, but the programmers knew how to make them work.

Many people have touted the Commodore 64 as the best 8-bit computer because it had hardware support for eight sprites, although this is not very many compared to the Nintendo Entertainment System that came out later. I think that the Atari 8-bit computers had better graphical capabilities overall.

Once we had 32-bit processors, there was no longer a need for hardware sprites. These systems were powerful enough that it was not a huge challenge to manipulate graphics on a screen. Also, with 32-bit systems, there was a greater emphasis on 3D graphics instead of 2D graphics.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Wednesday, October 5, 2022

How many "computers" do we own?

I remember a prediction that goes back to the 1980s that went like this: Someday you will throw away computers because your house will be littered with them. You will get computers in cereal boxes. At the time this seemed pretty far-fetched.

I am not sure that I can count the number of "computers" in my house. I have at least four obsolete iPhones, four Arcade1up machines, a videogame joystick from twenty years ago that is still fun to play, an NES Classic videogame system, a Raspberry Pi, two tablet computers, a laptop, a 2009 iMac that is only half working, a 2017 iMac, and a Fire TV stick. Technically, the dumb TV and Blu-ray player that I have are also computers. I recently gave away an old tablet that was pretty useless.

Some devices like my two external hard drives and external DVD drive might have some sort of CPU in them, although I am not sure.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Tuesday, October 4, 2022

Fire Tablets

Right now Amazon is selling some of their Fire Tablets at half price. Although they are budget tablets, not nearly as powerful as iPads, I am pretty impressed with the value for the price.
I have argued that if you have a good smartphone then you might not need a tablet, but I have enjoyed my Fire Tablet while traveling. They are more useful if you subscribe to Amazon Prime.


Wednesday, September 28, 2022

Facebook and Craigslist scams

I have noticed that a few people have created new Facebook accounts because they said that their old account was hacked or there is a problem with it. In some cases, people say that they can no longer access their old accounts.

This can get confusing because there are also fake accounts that look like the original accounts. All of this is part of various scams; for example, I had a 'friend' ask to borrow money. It is always strange when you get friend requests from people who are already friends, meaning that these new friend requests are likely from fake accounts. And if they are not fake accounts, then you need to verify that these people are who they say they are.

So today I encountered a scam. One of the hacked accounts had been talking to me briefly for a few days making small talk. At first, I was fooled thinking that I was talking to my friend in Salt Lake City. Then today the account tried to pull a scam that went like this:

He said that he was trying to install Facebook on his phone, but he needed a code number to finish the installation. He said that Facebook required that the code be sent to a second phone number, which makes no sense, and asked if he could use my phone number. I had already inadvertently given this person my phone number because I had asked him to give me a call so that we could chat.

I was immediately suspicious. As a seller on Craigslist, I have seen many scams where someone claims they want to verify your identity with a code number and asks you to message the code back to them. What they are actually trying to accomplish is a password reset on your account, which requires code verification, so if you give them the code, they will be able to change your Craigslist password and take over your account. Then they can use your account to scam other people.

So I realized this Facebook scammer was doing the same thing trying to take over my Facebook account. I contacted my friend on his new account and he verified that this other account was not him. However, I took this one step further. I told him that I didn't know which account was the real one, the old or the new, so I asked for his phone number so that I could call him and verify his identity. As soon as he answered the phone, I knew that I was talking to the real person.
I blocked the bad account and reported it to Facebook as someone impersonating a friend.

Never give a code number (or password) to a person in a message or email. It is a scam.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Monday, September 19, 2022

New iPhone

My new iPhone came with a USB-C to Lightning cable, but no charger.  Those bastards.

All my old Lightning cables and chargers are USB-A.  I almost ordered a USB-C charger, but why bother just to be able to use the cable that came in the box?

I'm having trouble setting up the new iPhone, so I will have to finish tonight.  However, Face ID works and I think that this is awesome.  The fingerprint system never worked for me.

--

Sunday, September 18, 2022

New iPhone and screen resolution

Recently, I cracked my iPhone 6s+ screen for the third time. I should be getting a new iPhone in the mail tomorrow.

The 10R is an ultra-cheap version of the iPhone 10. I'm going from a 5.5-inch 1080P screen to a 6.1-inch 828P screen. Apple claims that this is a Retina display, but it is only barely so. I was a bit annoyed when they came out with this phone four years ago because it was inconceivable to me that Apple would go below 1080P.

I've been arguing for a while that screen resolution is overrated. This will put my claim to the test.

For example, I tried playing videos and video games at various resolutions from 720P to a whopping 4K and 5K. For anything above 1080P, it is really hard for me to tell the difference. I like 1080P video the best, although 720P video is not terrible, but for a computer monitor I prefer 1440P. My 2017 iMac has a fantastic 5K display, but for a 27-inch screen this seems like overkill.

Best wishes,

John Coffey

Saturday, September 17, 2022

VP9 vs h264 & other Codecs » How it Fares Against Competition | Bitmovin

https://bitmovin.com/vp9-codec-status-quo/

I am using an extension to Chrome that forces Youtube to use the h264 video format (codec) instead of VP8/VP9.  (https://chrome.google.com/webstore/detail/h264ify/aleakchihdccplidncghkekgioiakgal)

Why is this an issue?   

Some newer graphics cards support decoding of VP8/VP9, which is a more efficient standard.  However, it is not supported in hardware on older computers like my 2017 iMac, nor on all iPhone models.  These devices have hardware support for H.264, meaning that H.264 is easier to decode: it won't slow your device down, make it run as hot, or use as much power.

Your web browser can decode VP8/VP9 in software, but your device may not have hardware support for it.  If speed and battery life aren't an issue, then it doesn't matter, and VP8/VP9 will likely use less Internet bandwidth.  If I were using a laptop, I would probably want this extension.

It is possible that most people won't notice a difference either way.  I do a lot of stuff on my computer, so it might make a difference.

Sunday, September 4, 2022

The things AMD DIDN’T tell us…

I've been waiting for this next generation of AMD chips.  I would prefer a new APU that would allow graphics on the same chip, but reportedly this is coming out next year.  The 7950x is 8 times more powerful than my desktop computer.

https://www.youtube.com/watch?v=hnAna6TTTQc&t=111s

Saturday, July 2, 2022

Build a PC while you still can

https://www.youtube.com/watch?v=LFQ3LkVF5sM

I'm not sure that I agree.  Although I have been waiting for the next AMD APU, having your graphics and processor on one chip gives you no versatility.  Apple achieved outstanding performance by having CPU, graphics, and memory all on the same M1 chip, which increased efficiency, but there is only so much that you can put on a CPU die, and there is no upgrade path except to buy a new computer.

Saturday, June 18, 2022

Puzzle Leaderboard - Chess Rankings - Chess.com

https://www.chess.com/leaderboard/tactics?page=171

I'm #8540 on the chess.com puzzle ratings.  I was expecting that the top ratings would not go too high, but I was wrong.  The top three ratings are all 65540, which, for the reasons I give below, I suspect is the highest possible rating.


I find this 65540 number suspicious because the top three ratings are all exactly this number.  The maximum value that can be stored in an unsigned 16-bit number is 65535.  If you want to save storage space, why use a 32-bit or a 64-bit number to store ratings when a 16-bit number would do?  The 65540 number almost fits.  You can make it fit by making the lowest possible rating the number 5 and storing the rating as an offset from that floor, so a stored value of 65535 displays as 65540.  Why would you set a lower limit on the rating?  To not accidentally run into a divide-by-zero problem, which can crash computer code, or other mathematical oddities from having a low or negative number in your equation.
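Here is a small C sketch of the scheme I am guessing at.  This is pure speculation on my part about how chess.com stores ratings; the floor of 5 is the assumed minimum.

    #include <stdint.h>
    #include <stdio.h>

    #define RATING_FLOOR 5   /* assumed minimum rating */

    /* Store a rating as a 16-bit offset from the floor. */
    static uint16_t pack_rating(long rating) {
        if (rating < RATING_FLOOR)          rating = RATING_FLOOR;
        if (rating > RATING_FLOOR + 65535L) rating = RATING_FLOOR + 65535L;
        return (uint16_t)(rating - RATING_FLOOR);
    }

    static long unpack_rating(uint16_t stored) {
        return (long)stored + RATING_FLOOR;
    }

    int main(void) {
        uint16_t stored = pack_rating(1900);               /* a typical rating */
        printf("stored as %u, reads back as %ld\n", (unsigned)stored,
               unpack_rating(stored));
        printf("max displayable rating: %ld\n", unpack_rating(65535)); /* 65540 */
        printf("min displayable rating: %ld\n", unpack_rating(0));     /* 5     */
        return 0;
    }

Under this scheme, every rating that hits the 16-bit ceiling would display as exactly 65540, which is what the leaderboard shows.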



Tuesday, June 14, 2022

Guide: What to do AFTER building your computer...

Re: Watch "Apple just killed M1 - WWDC 2022" on YouTube

I have a big side note here.  Machine instructions are binary codes, typically one or a few bytes long.  Each instruction represents a particular action to be taken by the microprocessor.  A typical machine code instruction might be to add one register's value to another and put the result in a specified location.  This would be the equivalent of the "add to memory" button on a calculator.

x86 processors are Complex Instruction Set Computers (CISC), whereas ARM processors are Reduced Instruction Set Computers (RISC).  RISC tries to be more efficient by doing fewer things faster.  There is a 30-year-old battle of architecture between RISC and CISC, and right now RISC is winning.

I wonder if there might be a better way.  I have long imagined a 16-bit or longer instruction where each bit represents a particular action to be taken by the microprocessor.  For example, one bit might be to fetch data from memory, another bit might be to fetch data from a register, and a third bit might be to add them together.  On a 64-bit processor with a 64-bit data bus, there would be no problem having 64 individual bits, each representing a particular action to be performed.  Such a system might allow the combining of actions in novel ways or the performing of some actions in parallel, increasing efficiency.
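Here is a rough C sketch of my thought experiment (not how any real processor works): each bit of the instruction word enables one micro-action, and a decoder performs whichever actions are switched on.

    #include <stdint.h>
    #include <stdio.h>

    /* Each bit of the instruction word enables one micro-action.
       This is only a thought experiment, not a real instruction set. */
    enum {
        OP_FETCH_MEM = 1 << 0,   /* fetch an operand from memory     */
        OP_FETCH_REG = 1 << 1,   /* fetch an operand from a register */
        OP_ADD       = 1 << 2,   /* add the two operands             */
        OP_STORE_REG = 1 << 3    /* write the result to a register   */
    };

    static void execute(uint16_t instruction) {
        int a = 0, b = 0, result = 0;
        if (instruction & OP_FETCH_MEM) a = 7;   /* stand-in for a memory read   */
        if (instruction & OP_FETCH_REG) b = 5;   /* stand-in for a register read */
        if (instruction & OP_ADD)       result = a + b;
        if (instruction & OP_STORE_REG) printf("result = %d\n", result);
    }

    int main(void) {
        /* One instruction word that fetches from memory and from a register,
           adds them, and stores the result: four actions combined. */
        execute(OP_FETCH_MEM | OP_FETCH_REG | OP_ADD | OP_STORE_REG);
        return 0;
    }

The appeal, at least in my imagination, is that independent actions whose bits are both set could be done at the same time.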

On Tue, Jun 14, 2022 at 6:05 PM John Coffey <john2001plus@gmail.com> wrote:
I've seen parts of this before.  M1 is alive and well.  M2 is a more powerful design, but only marginally better than M1, and will initially be available only on a couple of platforms.  I would choose M2 over M1, but I would rather have a 5-nanometer x86 chip, likely an AMD APU.

The current generation AMD 5700G would check most of my boxes.  It has a PassMark score of 20K, compared to 6K for the i5-7500 in my 2017 iMac and roughly 14.6K for the M1.  BTW, your laptop scored around 10K, but that is still noticeably better than my desktop.  My first priority was to get a more powerful processor, and the 5700G certainly fits the bill.  However, its graphics capability is around 900 gigaflops, compared to about 3.6 teraflops on my iMac and 12 teraflops on the Xbox Series X, which uses a custom AMD APU.  In other words, it is not a gaming powerhouse.

I could buy a 5950x with no graphics, which gets a PassMark score of around 40,000.  Then I would have to buy a graphics card.

So there are very strong rumors, which AMD has mostly confirmed, that they are coming out with a "Phoenix" line of APU processors that have much more graphics capability comparable to or surpassing some lower-end graphics cards.  This is what I am waiting for.


--

On Mon, Jun 13, 2022 at 11:19 PM Alber wrote:
This is a good overview of the Apple M2 chip and other Apple upgrades. The new chip is very impressive and will probably have Intel and AMD trying to do something similar in the future.




Tuesday, May 31, 2022

The best $589 you can spend at Apple

https://www.youtube.com/watch?v=-CWTBzZcp_k

Apple silicon doesn't run the Windows programs that I use, at least not without using some sort of very elaborate emulation.  Still, it is over twice as fast as my 2017 iMac desktop.


Tuesday, May 24, 2022

Space Invaders: Atari Archive Episode 32

For those who have some interest in video games.

https://www.youtube.com/watch?v=ad3TLYZOI-M

I can't emphasize enough how difficult it is to write programs on the Atari 2600, also called the Atari VCS.  Since the machine had only 128 bytes of RAM, there is no video memory at all.  Instead, as the raster draws the picture on the TV screen, the microprocessor has to constantly send information to the display as to which pixels to draw.  It is a miracle that it can display anything at all.  The code necessary to draw the screen is contained in the ROM cartridge.  Most of the microprocessor time is spent drawing the screen, and any game logic has to be done during the television vertical blank period, which is the time when the electron gun moves from the bottom of the screen back to the top to start the next frame.  The vertical blank lasts about 1330 microseconds and happens sixty times per second.
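To make the idea concrete, here is a toy C model of one frame.  The "register write" helpers are my own stand-ins for the 2600's video chip; real 2600 code does all of this in cycle-counted 6502 assembly.

    #include <stdio.h>

    static long register_writes = 0;

    /* Stand-ins for writes to the video chip's registers. */
    static void set_player_graphics(unsigned char pixels) { (void)pixels; register_writes++; }
    static void set_playfield(unsigned char pf)           { (void)pf;     register_writes++; }
    static void run_game_logic(void) { /* all game work must fit in the vertical blank */ }

    int main(void) {
        const int visible_lines = 192;            /* NTSC visible picture */
        unsigned char player_shape[192] = {0};
        unsigned char playfield[192]    = {0};

        /* One frame, repeated sixty times a second: */
        for (int line = 0; line < visible_lines; line++) {
            /* The CPU must feed the video chip fresh data on every scanline,
               because there is no frame buffer for the hardware to read from. */
            set_player_graphics(player_shape[line]);
            set_playfield(playfield[line]);
        }
        run_game_logic();                         /* squeezed into the vertical blank */

        printf("register writes needed for one frame: %ld\n", register_writes);
        return 0;
    }

Multiply that by sixty frames per second and it becomes clear why almost all of the processor's time goes to drawing the picture.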

There were a few rare 2600 cartridges that would have extra chips on them to boost the memory or the capabilities of the machine.  These special cartridges only got made when the chips became cheaper, like in the late 1980s which was near the end of the life of the 2600 game system.

Some early primitive computers with limited memory, like the Sinclair ZX80, ZX81, and Timex-Sinclair 1000, also used the microprocessor to draw the display.  This didn't involve computer code like on the 2600, but a hardware trick to get the microprocessor to copy bytes from memory to the display.  It is my understanding that the first Macintosh computer lost about 40% of its processor time driving its display.

Memory limitations would drive the graphics on all videogame systems and computers throughout the 1980s.  Instead of every pixel having its own unique memory location, which has been the norm since the mid-90s, the screen would be made up of tiles, or blocks, which are like the characters on a computer keyboard.  Each tile could be defined to be whatever you want, usually with a limited number of colors.  When I was programming on the Super Nintendo, the artists would create the tiles, and the program would tell the tiles where to display on the screen.  Objects that move on the screen are called "sprites"; the hardware displays these in front of the background tiles, and they are made up of their own separate tiles.  By the mid-1990s these kinds of display methods were no longer necessary because the chips were faster and the systems had more memory.



I’m tired of winning (and it's awesome)

https://www.youtube.com/watch?v=zTZuD4fVutc

The next generation of AMD CPUs coming this year has a big boost in CPU power, but not a big boost in integrated graphics.  I've been wanting a powerful APU, which has a CPU and powerful graphics on the same chip, saving the cost of a separate graphics card, like the custom chips that are on the XBOX Series X and the Sony PlayStation 5.  

The current generation AMD 5950x is a beast of a processor and can play games, but its graphics capability is very low compared to the videogame systems.

However, the next generation of AMD APUs is not coming out till next year or maybe the 4th quarter of this year as laptop processors.  If I want a powerful CPU and reasonably powerful graphics then either I would have to buy a new CPU and a graphics card, or settle for an upcoming laptop processor.  I think that 2023 should bring me some good options, although I was hoping to upgrade this year. 

My 2017 iMac can play games better than I expected.  It has a low-end graphics card like what would be found in a laptop.  However, the CPU power is unimpressive.  I have the option of upgrading the processor to an i7-7700K, at a cost of $350 to $400, but I would still be a few years out of date.  The better option is to wait for the next generation.

Friday, May 20, 2022

Special When Lit: A Pinball Documentary

This is a good documentary about Pinball.  It was made in 2009.

https://www.youtube.com/watch?v=kU52zteEbIE

I remember seeing a non-electric antique amusement machine that was probably from the 1930s.  It wasn't very big, but it worked by putting in a coin, like a nickel, and turning a handle to get roughly 7 to 10 metal balls.  Then you would pull a lever to shoot the balls at holes.  If the balls landed in the holes, then they would accumulate in the "score" window.  Although the game had a football theme, it was more like a pinball version of Skee-Ball.  As primitive as the game was, it was somewhat fun to play.

Growing up in small-city Indiana, there wasn't much amusement in the early 1970s.  I remember seeing some mechanical games, like a baseball-themed game and a shooting game, both of which I found thrilling to play.  I definitely felt addicted at first.  I was young and impressionable.  This started me down a path of enjoying games.  

As a side note, in late 1974 I began to enjoy playing chess immensely which I still do.

Around summer 1975, an arcade opened up in my local mall, which had mechanical games.  My friends and I enjoyed meeting and playing the games.  The cost of pinball was 2 games for a quarter.  These mechanical games eventually would mostly give way to video games.  

There was a perfect storm of events in the second half of the 1970s that would shape my life forever.  I already was very interested in electronics because at the time this was the cutting edge of technology.  I started reading about computers and I first got to use one in 1975.  I learned how to write simple computer programs, taking to programming as a duck takes to water.  In 1976 I made friends with someone who had built an extremely primitive computer from a kit, and I learned how to program it using "machine code" which is the more difficult language of the microprocessor itself.

In 1977 video games were starting to become popular and the movie Star Wars came out.  Both were very influential on my life.  The late 1970s were culturally defined by video games, pinball, Star Wars, and disco.  It was a time of cheap thrills when the economy was probably the worst since the Great Depression.  We had an oil crisis, massive inflation, and unemployment.  Most people today are too young to remember how difficult those times were.

I not only became interested in video games but I wanted to write games.  I was fortunate that my high school bought computers and taught simple computer programming in algebra class.  I was already developing programming skills and I spent much time writing programs on the school computers.

In the mid-1980s I was able to get my own computers and I started a business selling programs that I wrote, some of which were relatively primitive video games.  

In 1985 I temporarily had a job at a Showbiz Pizza maintaining and doing minor repairs on the videogames and mechanical games.  In 1993 I got my first job as a video game programmer in Utah.

Thursday, May 19, 2022

Two AIs talk about becoming human.


https://www.youtube.com/watch?v=jz78fSnBG0s

This exchange makes AI look smarter than it really is.  It is an AI designed to imitate human speech.  There isn't a deep understanding.

Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory. 

Sunday, April 24, 2022

Super Mario Bros. Pushes the Limits of the NES more than any other game!

Eight-bit videogames and computers have a hardware memory limit of 64K because their address bus is 16 bits wide.  For the Nintendo Entertainment System, only 40K of that could be on the cartridge as ROM.  To get around this, they had to put extra chips on the cartridge to allow banks of ROM to be swapped with each other.  Some more advanced NES cartridges got up to hundreds of kilobytes.  From the programmer's point of view, all this memory swapping is cumbersome, but it is transparent to the users.

Many 8-bit computers had this capability to begin with.  The Commodore 64 had 64K RAM plus 20K of ROM, making a total of 84K.  The Timex-Sinclair 2068 had 24K of ROM and 48K RAM for a total of 72K.     The Commodore 128, the Apple IIc,  and the Sinclair Spectrum 128 all had 128K of RAM plus ROMs.  

https://www.youtube.com/watch?v=nl8BuiGoc_c

The Atari 2600 had a memory limit of only 4K, and it took bank switching with extra chips on the cartridges to go over this limit.
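Here is a simplified C model of the general bank-switching idea (not any particular NES mapper or 2600 scheme): the processor sees only a small window of addresses, and a bank register chooses which slice of the larger cartridge ROM shows up in that window.

    #include <stdint.h>
    #include <stdio.h>

    #define WINDOW_SIZE  0x4000u            /* 16K window the CPU can see */
    #define ROM_SIZE     (8 * WINDOW_SIZE)  /* 128K cartridge ROM: 8 banks */

    static uint8_t rom[ROM_SIZE];           /* the whole cartridge          */
    static unsigned current_bank = 0;       /* latch set by writing the mapper */

    /* The CPU address selects a byte inside the window, and the bank latch
       selects which 16K slice of ROM that window currently shows. */
    static uint8_t read_windowed(uint16_t addr_in_window) {
        return rom[current_bank * WINDOW_SIZE + (addr_in_window % WINDOW_SIZE)];
    }

    int main(void) {
        for (unsigned i = 0; i < ROM_SIZE; i++)
            rom[i] = (uint8_t)(i / WINDOW_SIZE);   /* mark each bank with its number */

        current_bank = 0;
        printf("bank 0, address 0x0000 -> %u\n", read_windowed(0x0000));
        current_bank = 5;                          /* "write" to the bank register */
        printf("bank 5, address 0x0000 -> %u\n", read_windowed(0x0000));
        return 0;
    }

The extra chips on the cartridge are essentially that bank latch plus the logic to route the address lines, which is why larger games cost more to manufacture.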

Sixteen-bit computers usually have a memory limit of 16 megabytes, because their address buses are typically 24 bits wide, although depending upon the hardware it could be less.

Thirty-two-bit computers usually have a memory limit of 4 gigabytes.

In theory, 64-bit computers can get up to 16 billion gigabytes, although I would like to see somebody try.   You could probably heat your home or a large building with that much memory.
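All of these limits come from the width of the address: with N address bits you can reach 2^N bytes.  A quick C check of the arithmetic:

    #include <stdio.h>

    int main(void) {
        /* With N address bits, a processor can reach 2^N distinct bytes. */
        printf("16-bit address: %llu bytes (64K)\n",   1ULL << 16);
        printf("24-bit address: %llu bytes (16 MB)\n", 1ULL << 24);
        printf("32-bit address: %llu bytes (4 GB)\n",  1ULL << 32);
        /* 2^64 itself overflows a 64-bit integer, so compute it as a double. */
        printf("64-bit address: %.0f bytes (billions of gigabytes)\n",
               2.0 * (double)(1ULL << 63));
        return 0;
    }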

Sunday, March 27, 2022

Mechanical Calculator

The fact that people made mechanical computational devices shows that there is a strong need for computation.


I feel like the history of the computer begins with mechanical devices.

NCR started in the cash register business, which technically was an adding machine with a mechanical crank to make it work.  From there it is a natural transition to electric, then electronic, and eventually digital.

In order to help with the U.S. census, in the late 1800s, Herman Hollerith invented a mechanical tabulating machine that used punch cards.  Census takers would punch holes into cards depending upon the answers to the questions that they asked.  Then the machine could process the cards and add up the answers to specific questions.  This was long before we had computers, although the tabulating machine could be considered a type of computer.  This punch card technology would later be used to store computer programs and data.

Around 1971 my parents had a mechanical adding machine to help with their business.  It was heavy and bulky but it did the job.

Around the same time, a Japanese calculator company, Busicom, contracted with Intel to produce chips for an electronic calculator.  Up to that point, Intel had made integrated circuits with relatively simple logic circuits.  It was possible to build a big computer by combining a large number of these logic chips.  To power the calculator, Intel came up with the 4004 microprocessor, which is the 4-bit grandfather of the 8-bit 8008 and 8080 and the 16-bit 8086 chips that would follow.  The microprocessor revolution started with a calculator.

The 4004 chip had limited capabilities, but it was still the first whole computer processor on a single chip.  The first real microprocessor operating system, CP/M, was designed to run on the 8080 processor long before we had DOS or Windows.  CP/M was all the rage in the mid-1970s.  Consequently, a company called Zilog came up with a slightly superior 8080 clone called the Z80, which was compatible with CP/M.  The Z80 processor would go on to be used in the TRS-80, Sinclair, and Timex-Sinclair computers, as well as a whole series of MSX computers in Japan.  The chip would also be used in a few videogame systems.

On a more personal note, most early videogame systems did not have any kind of operating system or high-level language that they could be programmed in.  This meant that they had to be programmed in the language of the microprocessor itself, which is called machine code.  This is considered not only archaic but also technically much more difficult.  In the 1970s, one of the first computers I got my hands on was an RCA 1802 Elf computer, which was incredibly primitive, but I learned to write 1802 machine code on it.  In the late 1970s, I learned Z80 machine code on the TRS-80 computer.  In 1985, on the Timex-Sinclair 2068 computer, I wrote a videogame in Z80 machine code, using a tool called an assembler that I wrote myself.  Along the way, I picked up 6502 machine code, and in 1993 I got my first videogame job in Utah writing 65816 machine code, a more advanced 16-bit version of the 6502, for the Super Nintendo.  In 1999 I changed jobs, and I was back to writing Z80 machine code on the Gameboy Color.  By that point, the Z80 was considered mostly obsolete, but it was still being used on Gameboys.  Because of my previous experience with the Z80, I hit the ground running on that job, and my new boss was so impressed with my programming skills that he gave me a raise after my first week.

Best wishes,

John Coffey

Sunday, March 20, 2022

APUs are the FUTURE

https://www.youtube.com/watch?v=D6IFwAprjwc

The current trend in CPUs is to include graphics and other capabilities.  I've been following this for a few years, waiting for a sweet spot for me to buy or build a new PC.  This is especially interesting since there is a chip shortage and an even worse GPU shortage due to crypto miners.

I wanted something as capable as the Xbox Series X, but its custom AMD APU has an insane amount of graphical capability for a single chip, and it is proprietary.  The other APUs that AMD sells are not as graphically powerful.

The upcoming AMD 7000 series looks very interesting to me.  

7 Users on 1 PC! - but is it legal?

I like the first part of this video where he talks about old computers still being useful.  He claims that newer computers are overkill for the tasks that most people run, even in the year 2007.

https://www.youtube.com/watch?v=v8tjA8VyfvU

The rest of the video talks about 2007 and later products that allow you to share your PC with multiple users in violation of Microsoft's user agreement.  I found this interesting, but it is a bit long.

Saturday, March 12, 2022

Apple's M1 processors and the very expensive Mac Pro.

The bottom line is that I want a more powerful computer.  I can get by with what I have, but my 2017 iMac is only about twice as fast as my rapidly dying late 2009 iMac.  Considering the difference in years, I expected more progress.  I assumed that this would be enough, but it is a bit underwhelming.  Compared to most modern computers, it is way below average.  I have talked to a local repair shop about upgrading the processor to an i7-7700K, which would cost at least $400 with labor, but it would only boost my speed by about 60%.  That might be enough, but if I am getting into that kind of money then I might be better off buying another computer.

For this reason, I get excited when I see big progress being made in computer processors.  The last decade saw only incremental improvement, but what Apple has done with its recent M1 chips is just extraordinary.  The M1 chip is about 2.5 times faster than my 2017 iMac and uses far less power.

However, I'm not rushing out to buy a new Apple computer.  I also need Intel-based Windows compatibility.  My chess programs and other games need this platform.  It is possible to install an Arm-based Windows on an M1 Macintosh, which does come with some Intel emulation, but trying to run Intel-based games on this setup has been described as not worth the trouble.  There are compatibility and performance issues.

Instead, I am waiting for the other manufacturers to catch up to Apple.

In the second half of this year, AMD is going to release their 5-nanometer 7000 series of processors, reportedly all of which will come with some graphics capabilities built into the chips.  These won't be as good as an expensive GPU costing a thousand dollars, but the 7000 series of processors would allow someone to build or buy a powerful computer while saving on graphics hardware.  I suspect that depending on the hardware chosen, a computer with these chips could cost from $500 to $1,000.  I want one.

If you bought a late 2019/early 2020 Mac Pro you might feel like a chump right now.  These machines fully configured could cost $10,000 to $30,000.  These are not consumer devices but intended for professionals who do intensive tasks like video editing.  Still, the machine feels like overkill both in performance and price.  Apple took their extreme pricing to an even more extreme level by offering a very expensive computer monitor, where even the stand by itself cost $1,000.   

It turns out that the M1 chip is very good at video editing because it has specialized circuits dedicated to video processing.  When the M1 chip came out a year ago, I saw YouTubers claiming that they were going to sell their $30,000 Mac Pro because the $700 Mac Mini gave them all the performance that they need.  

However, Apple has taken the M1 chip to more extreme levels.  A few months ago, they introduced laptops that contain the equivalent of 2 or 4 M1 chips, starting at around $2,000.  Although these machines are powerful, this is more computer power than most people need.  Instead, it appears to me that you can get a really good laptop for a few hundred dollars.

I am not fond of laptops because I don't need anything portable.  Laptops typically cost more than desktops and deliver less performance.

Apple didn't stop there.  They just introduced a couple of Mac Studio models, which look like ugly boxes to me, with the equivalent of 4 M1 chips for $2,000 or the equivalent of 8 M1 chips for $4,000.  According to Apple, the higher-priced computer is 90% more powerful than the $30,000 Mac Pro that it has been selling for the last two years.  If you have a Mac Pro, you probably feel like a chump.  When Apple introduced it, they had to know that they were going to come out with the M1 chip a year later.

This tells me that Apple is always ready to gouge its customers.  They get away with it because some people have more money than sense.

The $4,000 Mac Studio is almost the most powerful computer that you can buy, and Apple claims that it is the most powerful computer for the price.

Apple has stated that they are going to come out with a new Mac Pro.  It might be an iMac model.  The rumor mill says that it will have the equivalent of 16 M1 chips on it, but using an upcoming M2 chip instead.  We shall see, but who needs this much power?

--

Friday, January 14, 2022

The New Snapdragon 8 Gen 1

Computer processing power matters to me because I do many processor-intensive tasks.  When I bought a 2009 iMac with a Core i7-860 processor, it was one of the fastest computers you could buy.  Today it gets stomped by most of the computers on the market.

The previous decade was a period with only marginal advancement in computer microprocessor power.  People were bragging about 10 and 20% improvements.  However, more developments were being made in mobile processors, especially with Apple.

Nevertheless, since 2020 we have seen some amazing progress.

This might not matter to most people, but the latest and greatest Android smartphone processor is 25% more powerful than my 2017 desktop iMac running a 3.4 GHz i5-7500 with active cooling.

https://www.youtube.com/watch?v=syJ3xn4q9xo&t=80s

My 2017 computer has a 959 Geekbench Single-Core Score and a 3072 Multi-Core Score.

The Apple A15 Bionic found on the iPhone 13 received a single-core score of 1732 and a multi-core score of 4685.  

The M1 chip used in the latest Apple laptops has scores of 1717 and 7328.  The M1 Max chip in Apple's new desktop computers has scores of 1749 and 11542.

I have no interest in buying another Apple computer, but I am impressed with their products.  It is only a matter of time before the competition catches up.

However, I am interested in the AMD Ryzen 7000 processors that will be released in the second half of this year.  This will be the first time that all the new AMD processors will have built-in graphics, possibly eliminating the need to buy a separate graphics card.