Tuesday, November 29, 2022

How Machine Language Works


I was fortunate that I got interested in computers really early, back in 1975, a couple of years before computers like the Apple II, Commodore PET, and TRS-80 came out.  I was also fortunate that someone I met lent me a computer that he had built from a kit: an RCA Elf.  This computer was so primitive, with only a quarter K of RAM, that you had to program it with a calculator-like keypad, entering numerical instructions into specific memory locations.  I mastered this just enough to get a working knowledge of machine code programming.
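
For anyone who has never seen machine code, here is a rough sketch in C of what "numerical instructions in memory" means: the program really is just a list of numbers that the processor steps through one at a time.  The three-instruction toy machine below is invented for illustration; it is not the RCA Elf's actual instruction set.

    #include <stdint.h>
    #include <stdio.h>

    /* Toy machine for illustration: NOT the RCA 1802's real instruction set. */
    enum { OP_HALT = 0x00, OP_LOAD = 0x01, OP_ADD = 0x02 };

    int main(void) {
        /* The "program", byte by byte, as you might key it in. */
        uint8_t mem[] = { OP_LOAD, 5, OP_ADD, 7, OP_HALT };

        uint8_t acc = 0;                          /* accumulator     */
        int pc = 0;                               /* program counter */
        for (;;) {
            uint8_t op = mem[pc++];               /* fetch           */
            if (op == OP_LOAD)      acc  = mem[pc++];
            else if (op == OP_ADD)  acc += mem[pc++];
            else break;                           /* OP_HALT         */
        }
        printf("accumulator = %d\n", acc);        /* prints 12       */
        return 0;
    }

On the Elf you keyed bytes like these into RAM through the keypad, and the microprocessor fetched and executed them directly.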

There was a saying in that era that if you knew how to program in machine code, you had a much deeper understanding of how computers work.  I learned several machine languages, and this proved very useful in getting jobs in the video game industry and being able to do those jobs.  When I went to work for Xanterra in 1999 to do Game Boy Color programming, I hit the ground running because I already knew how to program Z80s, which I had learned in the 1970s.  The owner of the company was impressed enough with my skills that he gave me a raise after my first week.

https://www.youtube.com/watch?v=HWpi9n2H3kE

Sunday, November 20, 2022

iPad Pro M2: What Does "Pro" Even Mean?

https://www.youtube.com/watch?v=O_WbmIIy4vk

If you have a good smartphone, a tablet feels unnecessary.

The last thing I need is a tablet that is 2.5 times faster than my desktop computer.  This is the kind of power you want on a laptop or a desktop.

The M1 is only 7% slower than the M2.  

A couple of years ago, I bought an Amazon Fire tablet on Black Friday for $80.  It is not a powerful tablet, but it works just fine as a portable Internet and streaming device.

4K Gamer Pro Review

https://www.youtube.com/watch?v=dL9U6n4IixQ

I did some experimenting on my computer by playing video games and videos at resolutions from 720P up to 5K. With my eyesight, which is about 20/30, on a 27-inch screen, I could not tell the difference between 1080P and anything higher. We are talking about levels of detail that are hard to perceive. I personally like 1440P, not that it is really an improvement over 1080P; it may be more psychological that I think 1440P is better.

Even if you have 20/20 vision and a 60-inch screen, you are going to be sitting further back, whereas I sit very close to my 27-inch screen. Can people really tell a difference with 4K on a big-screen TV?

Many years ago somebody made a video arguing that 4K was unnecessary because the human eye cannot resolve the difference. If it is unnecessary on a 55-inch TV, then it is probably unnecessary on a smartphone. I bought an iPhone 10R, which has a sub-1080P resolution, yet I never notice the resolution being too coarse.
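
Out of curiosity, the arithmetic behind these claims is easy to check. The sketch below assumes the common rule of thumb that 20/20 vision resolves about one arcminute of detail, and a 16:9 screen; both are simplifying assumptions, not hard physiological limits.

    #include <math.h>
    #include <stdio.h>

    /* Distance beyond which one pixel subtends less than one arcminute,
       i.e. where individual pixels blur together for a 20/20 eye
       (rule-of-thumb assumption). */
    static double vanish_distance_in(double diag_in, double h_pixels) {
        const double PI = 3.14159265358979323846;
        double width  = diag_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
        double pitch  = width / h_pixels;            /* one pixel, inches */
        double arcmin = (1.0 / 60.0) * PI / 180.0;   /* in radians        */
        return pitch / (2.0 * tan(arcmin / 2.0));
    }

    int main(void) {
        printf("27\" 1080P: ~%.0f in\n", vanish_distance_in(27, 1920));
        printf("27\" 5K:    ~%.0f in\n", vanish_distance_in(27, 5120));
        printf("55\" 4K:    ~%.0f in\n", vanish_distance_in(55, 3840));
        return 0;
    }

With those assumptions, pixels on a 27-inch 1080P monitor disappear beyond roughly 42 inches, and a 55-inch 4K TV resolves no extra detail past about 43 inches, which is much closer than most people sit to their televisions.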

Friday, November 18, 2022

Sprites



The word "sprite" is interesting. It means elf, fairy, or ghost, although it can also refer to flashes of different color lights in clouds caused by lightning. The word originated in the middle ages from the word "spirit". When I hear the word, I think of the Disney character Tinkerbell.

In computers and video games, a sprite is an image that can move on top of a background. Usually, these are 2D objects moving on top of a 2D background, although a game like the original Doom had 2D objects moving on top of a 3D background. The mouse pointer on a computer screen is technically a sprite.

Back in the days when computers and video games were 8-bit and 16-bit, it was helpful to have hardware support for sprites, which allowed graphical objects to move around independently of the background. This was helpful because manipulating screen graphics in software was taxing for those old, slow processors. When I was writing games for the Timex Sinclair 2068 and Atari ST computers, I had to write software to make all the graphics move because there was no hardware support for sprites, which made the task more technically challenging.
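
For a sense of what "making the graphics move in software" involves, here is a minimal sketch in C of a software sprite blit. The details (8-bit indexed pixels, color 0 treated as transparent) are assumptions for illustration, not the actual code I wrote back then.

    #include <stdint.h>

    /* Copy a w-by-h sprite onto the screen at (x, y), skipping
       transparent pixels so the background shows through. */
    void blit_sprite(uint8_t *screen, int screen_w,
                     const uint8_t *sprite, int w, int h,
                     int x, int y) {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++) {
                uint8_t p = sprite[row * w + col];
                if (p != 0)  /* color 0 = transparent */
                    screen[(y + row) * screen_w + (x + col)] = p;
            }
    }

Moving a sprite then means restoring the background under its old position and blitting it again at the new one, every frame; hardware sprites did all of that for free.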

The early arcade video games used hardware sprites and so did all early home video game consoles. The sprites on the Atari 2600 are extremely primitive and very difficult to program, but the programmers knew how to make them work.

Many people have touted the Commodore 64 as the best 8-bit computer because it had hardware support for eight sprites, each 24x21 pixels, although eight is not very many compared to the 64 sprites of the Nintendo Entertainment System that came out later. I think that the Atari 8-bit computers had better graphical capabilities overall.

Once we had 32-bit processors, there was no longer a need for hardware sprites. These systems were powerful enough that it was not a huge challenge to manipulate graphics on a screen. Also, with 32-bit systems, there was a greater emphasis on 3D graphics instead of 2D graphics.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Wednesday, October 5, 2022

How many "computers" do we own?

I remember a prediction from the 1980s that went like this: Someday you will throw computers away because your house will be littered with them. You will get computers in cereal boxes. At the time, this seemed pretty far-fetched.

I am not sure that I can count the number of "computers" in my house. I have at least four obsolete iPhones, four Arcade1up machines, a videogame joystick from twenty years ago that is still fun to play, a NES Classic videogame system, a Raspberry Pi, two tablet computers, a laptop, a 2009 iMac that is only half working, a 2017 iMac, and a Fire TV stick. Technically, the dumb TV and Blu-Ray player that I have are also computers. I recently gave away an old tablet that was pretty useless.

Some devices like my two external hard drives and external DVD drive might have some sort of CPU in them, although I am not sure.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Tuesday, October 4, 2022

Fire Tablets

Right now Amazon is selling some of their Fire Tablets at half price. Although they are budget tablets, not nearly as powerful as iPads, I am pretty impressed with the value for the price.

I have argued that if you have a good smartphone then you might not need a tablet, but I have enjoyed my Fire Tablet while traveling. They are more useful if you subscribe to Amazon Prime.


Wednesday, September 28, 2022

Facebook and Craigslist scams

I have noticed that a few people have created new Facebook accounts because they said that their old account was hacked or there is a problem with it. In some cases, people say that they can no longer access their old accounts.

This can get confusing because there are also fake accounts that look like the original accounts. All of this is part of scams; for example, I had a "friend" ask to borrow money. It is always strange when you get friend requests from people who are already friends, meaning that these new friend requests are likely from fake accounts. And if they are not fake accounts, then you need to verify that these people are who they say they are.

So today I encountered a scam. One of the hacked accounts had been making small talk with me for a few days. At first, I was fooled into thinking that I was talking to my friend in Salt Lake City. Then today the account tried to pull a scam that went like this:

He said that he was trying to install Facebook on his phone, but he needed a code number to finish the installation. He said that Facebook required that the code be sent to a second phone number, which makes no sense, and asked if he could use my phone number. I had already inadvertently given this person my phone number because I had asked him to give me a call so that we could chat.

I was immediately suspicious. As a seller on Craigslist, I have seen many scams where someone claims they want to verify your identity with a code number and that you have to message the code number back to them. What they are actually trying to accomplish is a password reset on your account, which requires code verification; if you give them the code, they can change your Craigslist password and take over your account. Then they can use your account to scam other people.

So I realized this Facebook scammer was doing the same thing trying to take over my Facebook account. I contacted my friend on his new account and he verified that this other account was not him. However, I took this one step further. I told him that I didn't know which account was the real one, the old or the new, so I asked for his phone number so that I could call him and verify his identity. As soon as he answered the phone, I knew that I was talking to the real person.
I blocked the bad account and reported it to Facebook as someone impersonating a friend.

Never give a code number (or password) to a person in a message or email. It is a scam.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com

Monday, September 19, 2022

New iPhone

My new iPhone came with a USB-C to Lightning cable, but no charger.  Those bastards.

All my old Lightning cables and chargers are USB-A.  I almost ordered a USB-C charger, but why bother just to be able to use the cable that came in the box?

I'm having trouble setting up the new iPhone, so I will have to finish tonight.  However, Face ID works, and I think that this is awesome.  The fingerprint system never worked for me.

--

Sunday, September 18, 2022

New iPhone and screen resolution

Recently, I cracked my iPhone 6s+ screen for the third time. I should be getting a new iPhone in the mail tomorrow.

The 10R is an ultra-cheap version of the iPhone 10. I'm going from a 5.5-inch 1080P screen to a 6.1-inch 828P screen. Apple claims that this is a Retina display, but it is only barely so. I was a bit annoyed when they came out with this phone four years ago because it was inconceivable to me that Apple would go below 1080P.

I've been arguing for a while that screen resolution is overrated. This will put my claim to the test.

For example, I tried playing videos and video games at various resolutions from 720P up to a whopping 4K and 5K. Above 1080P, it is really hard for me to tell the difference. I like 1080P video the best, although 720P video is not terrible; for a computer monitor, I prefer 1440P. My 2017 iMac has a fantastic 5K display, but for a 27-inch screen this seems like overkill.

Best wishes,

John Coffey

Saturday, September 17, 2022

VP9 vs h264 & other Codecs » How it Fares Against Competition | Bitmovin

https://bitmovin.com/vp9-codec-status-quo/

I am using an extension to Chrome that forces Youtube to use the h264 video format (codec) instead of VP8/VP9.  (https://chrome.google.com/webstore/detail/h264ify/aleakchihdccplidncghkekgioiakgal)

Why is this an issue?   

Some newer graphics cards support decoding VP8/VP9 in hardware, and VP9 is a more efficient standard.  However, it is not supported in hardware on older computers like my 2017 iMac, nor on all iPhone models.  These devices do have hardware support for H264, meaning that H264 is easier to decode and won't slow your device down, make it run as hot, or use as much power.

Your web browser can decode VP8/VP9, but your device may not have hardware support for it.  If speed and battery life aren't an issue, then it doesn't matter, and VP8/VP9 will likely use less Internet bandwidth.  If I were using a laptop, I would probably want this extension.

It is possible that most people won't notice a difference either way.  I do a lot of stuff on my computer, so it might make a difference for me.

Sunday, September 4, 2022

The things AMD DIDN’T tell us…

I've been waiting for this next generation of AMD chips.  I would prefer a new APU that would allow graphics on the same chip, but reportedly this is coming out next year.  The 7950x is 8 times more powerful than my desktop computer.

https://www.youtube.com/watch?v=hnAna6TTTQc&t=111s

Saturday, July 2, 2022

Build a PC while you still can

https://www.youtube.com/watch?v=LFQ3LkVF5sM

I'm not sure that I agree.  Although I have been waiting for the next AMD APU, having your graphics and processor on one chip gives you no versatility.  Apple achieved outstanding performance by putting the CPU, graphics, and memory all on the same M1 chip, which increased efficiency, but there is only so much that you can put on a CPU die, and there is no upgrade path except to buy a new computer.

Saturday, June 18, 2022

Puzzle Leaderboard - Chess Rankings - Chess.com

https://www.chess.com/leaderboard/tactics?page=171

I'm #8540 on the chess.com puzzle ratings.  I was expecting that the top ratings would not be very high, but I was wrong.  The top three ratings are all 65540, which, for the reasons I give below, I suspect is the highest possible rating.


I find this 65540 number suspicious because the top three ratings are all exactly this number.  The maximum value that can be stored in an unsigned 16-bit number is 65535.  If you want to save storage space, why use a 32-bit or a 64-bit number to store ratings when a 16-bit number would do?  The 65540 number almost fits.  You can make it fit by making the lowest possible rating 5 and storing the rating as an offset from that floor: 65540 minus 5 is exactly 65535.  Why would you set a lower limit on the rating?  To avoid accidentally running into a divide-by-zero problem, which can crash computer code, or other mathematical oddities from having a low or negative number in your equation.
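
Here is a small C sketch of how a cap of 65540 could fall out of a 16-bit field with a floor of 5.  This is pure speculation about chess.com's internals; it only demonstrates the arithmetic.

    #include <stdint.h>
    #include <stdio.h>

    #define RATING_FLOOR 5  /* hypothetical lowest allowed rating */

    static uint16_t encode_rating(uint32_t rating) {
        if (rating < RATING_FLOOR) rating = RATING_FLOOR;
        if (rating - RATING_FLOOR > UINT16_MAX)
            return UINT16_MAX;              /* saturate at the cap */
        return (uint16_t)(rating - RATING_FLOOR);
    }

    static uint32_t decode_rating(uint16_t stored) {
        return (uint32_t)stored + RATING_FLOOR;
    }

    int main(void) {
        /* An absurdly high rating clamps to 65535 + 5 = 65540. */
        printf("%u\n", decode_rating(encode_rating(1000000)));
        return 0;
    }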



Tuesday, June 14, 2022

Guide: What to do AFTER building your computer...

Re: Watch "Apple just killed M1 - WWDC 2022" on YouTube

I have a big side note here.  Microprocessor instructions are typically 8 or 16 bits long, often followed by operand bytes.  Each instruction represents a particular action to be taken by the microprocessor in machine code.  A typical machine code instruction might be to add one register's value to another and put the result in a specified location; on the Z80, for example, the single byte 0x80 means "add register B to the accumulator."  This would be the equivalent of the "add to memory" button on a calculator.

x86 processors are Complex Instruction Set Computers (CISC), whereas Arm processors are Reduced Instruction Set Computers (RISC).  RISC tries to be more efficient by doing fewer things faster.  There is a 30-year-old architectural battle between RISC and CISC, and right now RISC is winning.

I wonder if there might be a better way.  I have long imagined a 16-bit or longer instruction where each bit represents a particular action to be taken by the microprocessor.  For example, one bit might be to fetch data from memory, and another bit might be to fetch data from a register, and a third bit might be to add them together.  On a 64-bit processor with a 64-bit data bus, there would be no problem having 64 individual bits each presenting a particular action to be performed.  Such a system might allow the combining of actions in novel ways or the performing of some actions in parallel, increasing efficiency.
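
What I am describing is close to what hardware designers call horizontal microcode: a wide control word in which each bit drives one action directly.  Here is a sketch in C of such an instruction word; the encoding and field names are invented for illustration.

    #include <stdint.h>

    /* One bit per action, as described above (invented encoding). */
    enum {
        OP_FETCH_MEM = 1u << 0,  /* fetch operand from memory     */
        OP_FETCH_REG = 1u << 1,  /* fetch operand from a register */
        OP_ADD       = 1u << 2,  /* add the two operands          */
        OP_WRITE_REG = 1u << 3,  /* write result to a register    */
        OP_WRITE_MEM = 1u << 4,  /* write result to memory        */
        /* ... up to 64 independent control bits in a 64-bit word */
    };

    /* "Add a memory operand to a register": several actions
       combined in one instruction word.  A decoder would simply
       test bits, e.g. if (word & OP_ADD) { ... }. */
    static const uint64_t add_mem_to_reg =
        OP_FETCH_MEM | OP_FETCH_REG | OP_ADD | OP_WRITE_REG;

VLIW processors expose a related idea to the compiler, and the microcode inside many CISC chips works on this principle internally.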

On Tue, Jun 14, 2022 at 6:05 PM John Coffey <john2001plus@gmail.com> wrote:
I've seen parts of this before.  M1 is alive and well.  M2 is a more powerful design, but only marginally better than M1, and it will initially be available on only a couple of platforms.  I would choose M2 over M1, but I would rather have a 5-nanometer x86 chip, likely an AMD APU.

The current generation AMD 5700G would check most of my boxes.  It has a PassMark score of 20K, compared to 6K for the i5-7500 in my 2017 iMac and roughly 14.6K for the M1.  BTW, your laptop scored around 10K, but that is still noticeably better than my desktop.  My first priority was to get a more powerful processor, and the 5700G certainly fits the bill.  However, its graphics capability is around 900 gigaflops, compared to about 3.6 teraflops on my iMac and 12 teraflops on the Xbox Series X, which uses a custom AMD APU.  In other words, it is not a gaming powerhouse.

I could buy a 5950x with no graphics, which gets a PassMark score of around 40,000.  Then I would have to buy a graphics card.

So there are very strong rumors, which AMD has mostly confirmed, that they are coming out with a "Phoenix" line of APU processors with much more graphics capability, comparable to or surpassing some lower-end graphics cards.  This is what I am waiting for.


--

On Mon, Jun 13, 2022 at 11:19 PM Alber wrote:
This is a good overview of the Apple M2 chip and other Apple upgrades. The new chip is very impressive and will probably have Intel and AMD trying to do something similar in the future.




Tuesday, May 31, 2022

The best $589 you can spend at Apple

https://www.youtube.com/watch?v=-CWTBzZcp_k

Apple silicon doesn't run the Windows programs that I use, at least not without using some sort of very elaborate emulation.  Still, it is over twice as fast as my 2017 iMac desktop.


Tuesday, May 24, 2022

Space Invaders: Atari Archive Episode 32

For those who have some interest in video games.

https://www.youtube.com/watch?v=ad3TLYZOI-M

I can't emphasize enough how difficult it is to write programs for the Atari 2600, also called the Atari VCS.  Since the machine only had 128 bytes of RAM, there is no video memory at all.  Instead, as the raster draws the picture on the TV screen, the microprocessor has to constantly feed the display information about which pixels to draw.  It is a miracle that it can display anything at all.  The code necessary to draw the screen is contained in the ROM cartridge.  Most of the microprocessor's time is spent drawing the screen, and any game logic has to be done during the television's vertical blank period, which is the time when the electron gun moves from the bottom of the screen back to the top to start the next frame.  The vertical blank lasts about 1330 microseconds and happens sixty times per second.
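
To give a flavor of "racing the beam," here is a C-shaped sketch of a 2600-style display kernel.  Real kernels are hand-cycle-counted 6502 assembly; the TIA register offsets below (WSYNC, PF0-PF2) are as I recall them, and the rest is a structural illustration, not working 2600 code.

    #include <stdint.h>

    extern volatile uint8_t TIA[];  /* memory-mapped at $00 on real hardware */
    #define WSYNC 0x02   /* any write halts the CPU until end of scanline */
    #define PF0   0x0D   /* playfield graphics registers */
    #define PF1   0x0E
    #define PF2   0x0F

    /* Feed the TIA fresh graphics on each of the ~192 visible NTSC
       scanlines, just ahead of the electron beam. */
    void draw_frame(const uint8_t pf[192][3]) {
        for (int line = 0; line < 192; line++) {
            TIA[WSYNC] = 0;          /* wait for the start of a scanline */
            TIA[PF0] = pf[line][0];
            TIA[PF1] = pf[line][1];
            TIA[PF2] = pf[line][2];
        }
        /* All game logic must squeeze into the vertical blank
           that follows this loop. */
    }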

There were a few rare 2600 cartridges that had extra chips on them to boost the memory or the capabilities of the machine.  These special cartridges only got made when the chips became cheaper, in the late 1980s, near the end of the 2600's life.

Some early primitive computers with limited memory, like the Sinclair ZX80, ZX81, and Timex Sinclair 1000, also used the microprocessor to draw the display.  This didn't involve computer code like on the 2600, but rather a hardware trick that got the microprocessor to copy bytes from memory to the display.  It is my understanding that the first Macintosh computer lost about 40% of its processor time driving its display.

Memory limitations would drive the graphics on all videogame systems and computers throughout the 1980s.  Instead of every pixel having its own unique memory location, which has only been the norm since the mid-90s, the screen would be made up of tiles, or blocks, which are like the characters on a computer keyboard.  Each tile could be defined to be whatever you want, usually with a limited number of colors.  When I was programming on the Super Nintendo, the artists would create the tiles, and the program would tell the tiles where to display on the screen.  Objects that move on the screen are called "sprites"; the hardware displays these in front of the background tiles, and they are made up of their own separate tiles.  Since the mid-1990s, these kinds of display methods have no longer been necessary because the chips are faster and the systems have more memory.
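
Here is a small C sketch of tile-based rendering as described above.  The sizes (8x8 tiles, a 32x28 map) are typical of that era's hardware, but the names and layout are invented for illustration.

    #include <stdint.h>

    #define TILE_W 8
    #define TILE_H 8
    #define MAP_W  32   /* tiles across the screen */
    #define MAP_H  28   /* tiles down the screen   */

    /* The tilemap is one byte per cell (896 bytes) instead of a full
       frame buffer (256x224 pixels = 57,344 bytes): that is the whole
       memory-saving trick. */
    void render(uint8_t *screen, int pitch,
                const uint8_t tiles[][TILE_W * TILE_H],
                const uint8_t map[MAP_H][MAP_W]) {
        for (int ty = 0; ty < MAP_H; ty++)
            for (int tx = 0; tx < MAP_W; tx++) {
                const uint8_t *t = tiles[map[ty][tx]];
                for (int y = 0; y < TILE_H; y++)
                    for (int x = 0; x < TILE_W; x++)
                        screen[(ty * TILE_H + y) * pitch +
                               tx * TILE_W + x] = t[y * TILE_W + x];
            }
    }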



I’m tired of winning (and it's awesome)

https://www.youtube.com/watch?v=zTZuD4fVutc

The next generation of AMD CPUs coming this year has a big boost in CPU power, but not a big boost in integrated graphics.  I've been wanting a powerful APU, which has a CPU and powerful graphics on the same chip, saving the cost of a separate graphics card, like the custom chips in the Xbox Series X and the Sony PlayStation 5.

The current generation AMD 5950x is a beast of a processor and can play games, but its graphics capability is very low compared to the videogame systems.

However, the next generation of AMD APUs is not coming out till next year or maybe the 4th quarter of this year as laptop processors.  If I want a powerful CPU and reasonably powerful graphics then either I would have to buy a new CPU and a graphics card, or settle for an upcoming laptop processor.  I think that 2023 should bring me some good options, although I was hoping to upgrade this year. 

My 2017 iMac can play games better than I expected.  It has a low-end graphics card like what would be found in a laptop.  However, the CPU power is unimpressive.  I have the option of upgrading the processor to an i7-7700K, at a cost of $350 to $400, but I would still be a few years out of date.  The better option is to wait for the next generation.

Friday, May 20, 2022

Special When Lit: A Pinball Documentary

This is a good documentary about pinball, made in 2009.

https://www.youtube.com/watch?v=kU52zteEbIE

I remember seeing a non-electric antique amusement machine that was probably from the 1930s.  It wasn't very big, but it worked by putting in a coin, like a nickel, and turning a handle to get roughly 7 to 10 metal balls.  Then you would pull a lever to shoot the balls at holes.  If the balls landed in the holes, they would accumulate in the "score" window.  Although the game had a football theme, it was more like a pinball version of Skee-Ball.  As primitive as the game was, it was somewhat fun to play.

Growing up in small-city Indiana in the early 1970s, I didn't have much amusement available.  I remember seeing some mechanical games, like a baseball-themed game and a shooting game, both of which I found thrilling to play.  I definitely felt addicted at first.  I was young and impressionable.  This started me down a path of enjoying games.

As a side note, in late 1974 I began to enjoy playing chess immensely, which I still do.

Around the summer of 1975, an arcade with mechanical games opened up in my local mall.  My friends and I enjoyed meeting there and playing the games.  The cost of pinball was 2 games for a quarter.  These mechanical games would eventually mostly give way to video games.

There was a perfect storm of events in the second half of the 1970s that would shape my life forever.  I was already very interested in electronics because at the time this was the cutting edge of technology.  I started reading about computers, and I first got to use one in 1975.  I learned how to write simple computer programs, taking to programming as a duck takes to water.  In 1976 I made friends with someone who had built an extremely primitive computer from a kit, and I learned how to program it using "machine code", which is the native, more difficult language of the microprocessor itself.

In 1977, video games were starting to become popular and the movie Star Wars came out.  Both were very influential on my life.  The late 1970s were culturally defined by video games, pinball, Star Wars, and disco.  It was a time of cheap thrills, when the economy was probably at its worst since the Great Depression.  We had an oil crisis, massive inflation, and unemployment.  Most people today are too young to remember how difficult those times were.

I not only became interested in playing video games, but I also wanted to write them.  I was fortunate that my high school bought computers and taught simple computer programming in algebra class.  I was already developing programming skills, and I spent much time writing programs on the school computers.

In the mid-1980s I was able to get my own computers and I started a business selling programs that I wrote, some of which were relatively primitive video games.  

In 1985 I temporarily had a job at a Showbiz Pizza maintaining and doing minor repairs on the videogames and mechanical games.  In 1993 I got my first job as a video game programmer in Utah.

Thursday, May 19, 2022

Two AIs talk about becoming human.


https://www.youtube.com/watch?v=jz78fSnBG0s

This exchange makes the AI look smarter than it really is.  It is an AI designed to imitate human speech; there isn't a deep understanding behind it.

Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory.