Wednesday, March 8, 2023

"I made Mario Kart for the NES!!! "


I worked on some Test Drive games for the Gameboy Color, which were similar in concept.

This isn't actually a NES game, but an imitation of what a NES game would look like.

I've downloaded the game and will try it later.

Life Before the Internet… You had to do WHAT!

This should be self-evident, but I grew up (mostly) before computers, microwave ovens, CDs, DVDs, and VCRs. High tech was a stereo system, transistor radio, or color TV.

https://www.youtube.com/watch?v=o_nb7LlLGzc


Tuesday, March 7, 2023

M2 Mac Mini


I have come to believe that the M2 Mac Mini is a con, and this is what we have come to expect from Apple.  If you want the coolness of Apple products, then you have to pay the Apple Tax. The M2 Mac Mini is successful because Apple does a really good job of hyping its products.

I want a faster computer than what I have now, and the M2 is most definitely faster, but the base model has only a quarter of the memory and storage of a comparably priced PC.  Upgrading is cost-prohibitive because of what Apple charges for those upgrades, which I think is an insult.  Once you buy it, you have a computer that is not upgradeable, won't run my Windows games, and could die in five years due to an SSD failure that can't be repaired.

I recently saw a gaming PC on sale for $700.  Not only did it have a faster processor, more storage, and 32 gigs of RAM, but those components can be replaced if they fail.  Any decent PC could last up to 10 years. 


There are mini PCs that can give the Mac Mini a run for its money. I'm holding out for the AMD Ryzen 9 7940HS, which should be at least as powerful, if not more so.

There are use cases, like content creation or app development, where the M2 processor with a memory upgrade might make sense.

Tuesday, February 28, 2023

Chat GPT

Talk about Paul Morphy.

Paul Morphy (1837-1884) was an American chess player who is widely regarded as one of the greatest chess players of all time. He was born in New Orleans, Louisiana, and began playing chess at a very young age, learning the game from his father and uncle. Morphy showed an early talent for chess and quickly developed his skills.

In 1857, Morphy traveled to New York to compete in the first American Chess Congress. He won the tournament, defeating several of the best players in the country, including Alexander Beaufort Meek and Louis Paulsen. Following his success in New York, Morphy traveled to Europe to compete against the best chess players in the world.

In Europe, Morphy quickly established himself as a dominant force in chess. He defeated many of the top players of the time, including Adolf Anderssen, the winner of the 1851 London Chess Tournament, and Howard Staunton, the leading English player. Morphy's style of play was aggressive and attacking, and he was known for his brilliant combinations and sacrifices.

Despite his success, Morphy's career was relatively short-lived. He retired from competitive chess at the age of 21 and returned to his law studies in Louisiana. Morphy's retirement from chess was a great loss to the game, and many chess historians speculate about what he might have achieved had he continued to play.

Despite his brief career, Morphy's contributions to chess have been significant. He was one of the first players to emphasize the importance of developing the pieces in the opening, and he introduced several new attacking ideas that are still used by chess players today. Morphy's games continue to be studied and admired by chess players around the world, and he remains one of the most celebrated figures in the history of the game.


Talk about Paul Morphy in the style of Jordan Peterson

Well, Paul Morphy was a fascinating figure, and his story speaks to many of the fundamental aspects of human nature that I often discuss in my work.

Morphy was a brilliant chess player, and his success can be attributed in part to his extraordinary talent and dedication. But I believe that there were deeper psychological factors at play that contributed to his greatness.

Morphy was deeply curious and passionate about the game of chess, and he devoted himself to mastering its intricacies and strategies. He was also a natural risk-taker, unafraid to take bold moves and make sacrifices in pursuit of victory.

But I think what really sets Morphy apart is his ability to channel his emotions and intuition into his play. He was known for his aggressive style, which often involved sacrificing pieces in order to gain an advantage. This kind of bold, intuitive play requires a deep understanding of oneself and one's emotions, as well as the ability to channel those emotions into productive action.

Morphy's story also illustrates the importance of pursuing one's passions, even in the face of opposition or criticism. Morphy's family did not always support his interest in chess, and he faced significant challenges and setbacks throughout his career. But he remained committed to his love of the game, and ultimately achieved great success as a result.

In conclusion, Paul Morphy's life and career exemplify many of the themes that I often explore in my work - the importance of pursuing one's passions, the power of intuition and emotion, and the role of risk-taking and determination in achieving greatness. He was a remarkable figure who continues to inspire and fascinate people to this day.


Was Paul Morphy crazy?

There is no evidence to suggest that Paul Morphy was clinically insane or suffering from any kind of mental illness. While he did exhibit some eccentricities in his behavior, such as his tendency to pace and mutter to himself during games, these are not necessarily indicators of mental illness.

Some historians have suggested that Morphy may have suffered from obsessive-compulsive disorder (OCD), based on his reported need to touch each chess piece before making a move. However, this is purely speculative and there is no way to know for sure.

It's worth noting that Morphy lived in a time when mental illness was not well understood, and people with mental health issues were often stigmatized or marginalized. As a result, it's possible that Morphy's behavior may have been misinterpreted or unfairly judged by his contemporaries.

In any case, while Morphy was certainly a unique and sometimes eccentric figure, there is no reason to believe that he was insane or suffering from any kind of mental illness. Instead, he was a brilliant and passionate chess player who dedicated his life to mastering the game that he loved.


Was Bobby Fischer crazy?

Bobby Fischer was a legendary American chess player who achieved international fame for his historic victory over Boris Spassky in the 1972 World Chess Championship. Fischer's behavior in his later years was often erratic and controversial, leading some people to speculate that he may have been mentally ill.

It's difficult to say for certain whether or not Fischer was clinically insane or suffered from a mental illness, as he was never formally diagnosed. However, he did exhibit many symptoms of mental illness, including paranoia, delusions, and extreme mood swings.

Fischer's behavior became increasingly erratic in the years following his victory over Spassky, and he became known for his controversial public statements and bizarre behavior. He also became increasingly reclusive and isolated, cutting off contact with many of his friends and family members.

Some experts have suggested that Fischer may have suffered from a form of schizophrenia or bipolar disorder, which could explain his erratic behavior and extreme moods. However, this is purely speculative, and without a formal diagnosis, it's impossible to know for sure.

In any case, it's clear that Fischer's later years were marked by significant mental and emotional turmoil. While his contributions to the game of chess remain significant, it's important to remember that he was a complex and troubled individual who struggled with a range of personal and psychological challenges.

Wednesday, February 22, 2023

ChatGPT Generated Fictional Podcast with A.I Joe Rogan & Jordan Peterson

Once again we get an AI generated script and computer generated voices.

https://youtu.be/sVThCuT8j0A

Is this what the future holds? We might be watching our favorite actors, some long dead, simulated, with a never-ending stream of AI-created content. You could have Tom Hanks on TV 24/7, and the shows would never repeat.

"In the next hour, Tom Hanks, Jimmy Stewart, and Charlie Chaplin take their air car on a grand tour of New Detroit."

Best wishes,

John Coffey

Tuesday, January 31, 2023

Apple Working on 20.5-Inch All-Screen Foldable Notebook for 2025


It is still hard for me to get excited about tablet computers.  This is described as a notebook, so does that make it a laptop replacement?  If so, why would I want a laptop with no keyboard?

Sunday, January 29, 2023

BUY The M2 Mac mini! (It's basically a $599 Mac Pro...)

https://www.youtube.com/watch?v=PXHxY2i9w9Y

I'm a little frustrated. The $600 price is good, but the slower SSD and the inadequate memory and storage are like a slap in the face. The excessive cost of upgrading is an insult. If they think that I am going to pay $200 for an additional 8 gigs of RAM or $200 for an additional 256 gigs of SSD, well, count me out.
For the cost of the computer with upgrades, say an additional $400, I could get a really nice PC with more memory and storage.  
As good as the computer may be, I feel that it makes Apple look uncompetitive. In the year 2023 and beyond, having only 8 gigs of RAM or 256 gigs of storage is not a valid option, unless you are just a basic computer user, in which case why are you buying an M2 chip? This mismatch between power, storage, and memory is a paradoxical attempt to get you to spend more money.

Saturday, January 21, 2023

ChatGPT

What is Climate Alarmism?

Climate alarmism refers to the belief that the potential impacts of climate change, such as rising sea levels, extreme weather events, and loss of biodiversity, are so severe that immediate and drastic action is needed to address the issue. It is often used to describe those who are seen as exaggerating or overstating the potential impacts of climate change, and who may be viewed as overly alarmist or sensationalist in their messaging.




So far, I am not that impressed with ChatGPT.  Its primary skill is to take information available elsewhere on the web and condense it into a few paragraphs of text.

I asked it some machine-code programming questions, and it failed miserably, even though it claims to know how to do this.


--

Friday, December 30, 2022

Avatar Review



"Avatar" is a science fiction epic directed by James Cameron and released in 2009. The movie tells the story of Jake Sully, a disabled former Marine who is sent to the distant planet of Pandora to participate in a corporate-funded project to mine a valuable mineral called unobtanium. While there, Jake falls in love with the native Pandora inhabitants, the Na'vi, and becomes torn between his loyalty to his human employers and his growing connection to the Pandora ecosystem and its inhabitants.

One of the standout elements of "Avatar" is its groundbreaking visuals and special effects, which were groundbreaking at the time of its release and hold up well even by today's standards. The use of motion capture technology and 3D animation allowed the filmmakers to create fully-realized, lifelike characters and breathtakingly realistic environments that are a joy to behold. The movie's action scenes are also well-choreographed and exciting, with impressive set pieces that showcase the unique creatures and landscapes of Pandora.

The story of "Avatar" is not particularly original, with elements of the "white savior" trope and a simplistic, good-versus-evil narrative that pits the human characters against the Na'vi. However, the movie's themes of environmentalism and cultural imperialism are timely and thought-provoking, and the performances of the cast, particularly Sam Worthington as Jake and Zoe Saldana as the Na'vi warrior Neytiri, are strong.

Overall, "Avatar" is a visually stunning and entertaining action adventure that is worth seeing for its groundbreaking special effects and strong performances. While its story may be somewhat simplistic and familiar, the movie's themes and visuals more than make up for it.

--
Best wishes,

John Coffey


P.S.  I didn't write this review.  An AI called chatGPT did.




Tuesday, December 20, 2022

What It's Like To be a Computer: An Interview with GPT-3

There is this thing called the Turing Test, invented by Alan Turing seventy years ago.  The idea is to see if a computer could become smart enough to fool a human into thinking he is talking to a real person.  We have crossed a threshold where computers have almost reached this point.

https://www.youtube.com/watch?v=PqbB07n_uQ4

The AI appears to understand more than it actually does.  It has studied human conversation and a mountain of raw information so that it can imitate a human conversation.

However, having a conversational computer isn't the only threshold the machines have crossed recently.  Computer AI has become much more useful, performing all kinds of new tasks, such as surgery or writing computer code.  By the end of the decade, machines will be performing many more jobs.  It is very likely that in the next couple of decades, or even in this one, we will have general-purpose robots that could perform any task that we want them to do.

Friday, December 16, 2022

The Current State of Windows on ARM-Architecture (& Its Promising Future)

The industry is moving toward efficient ARM processors, in an effort to catch up to Apple.  

https://www.youtube.com/watch?v=psbucvxF-UU&t=468s

This video is talking about a future processor.  I saw another video claiming that Microsoft's current ARM-based devices fall way short of what Apple's M-series chips can do.

Although the video claims that there is no problem running X86 programs on ARM processors using emulation (https://youtu.be/psbucvxF-UU?t=195), reportedly many video games have difficulty or don't work at all under emulation.  However, if we are moving toward two competing hardware architectures, I hope that many software makers will compile their software to work on both platforms.  For video games, it can be more complicated because the GPU hardware can be different.

I don't think that Intel will stand still as their processors get out-competed.  All they would need to do is come up with more powerful processors to recapture the market.  They are also planning to come out with 3-nanometer chips in a couple of years.

However, ARM's main strength is power efficiency, so it will remain popular on portable devices.

According to one video, the industry is moving away from PCs as we know them today, and toward System on a Chip devices like what Apple has done.  Although these tend to be more efficient, you can't change the hardware configuration, like the memory size, after you buy them.



NintenDeen's Questioning if Donkey Kong Country 2 is the Greatest 2D Pla...

https://youtu.be/0F813wYuhVk

Tuesday, November 29, 2022

How Machine Language Works


I was fortunate that I got interested in computers really early, back in 1975, which was a couple of years before computers like the Apple II, Commodore Pet, and TRS-80 came out.  I was also fortunate that someone I met lent me a computer that he had built from a kit, which was an RCA Elf.  This computer was so primitive and had so little memory, only a quarter K of RAM, that you had to program it with a calculator-like keypad, inserting numerical instructions into specific memory locations.  I was able to master this just enough to get a working knowledge of machine code programming.
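The essence of what the Elf required can be sketched in a few lines: a program is nothing but numbers sitting in memory, and the processor fetches and decodes them one at a time. The three-opcode instruction set below is made up purely for illustration; it is not the RCA 1802's real instruction set.

```python
# A toy illustration of machine-code programming: the program is just
# numbers placed in memory, which a fetch-decode-execute loop interprets.
# The opcodes here are invented for illustration, not the RCA 1802's.

def run(memory):
    """Interpret a list of numeric instructions and return the accumulator."""
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        opcode = memory[pc]
        if opcode == 0x00:            # HALT
            return acc
        elif opcode == 0x01:          # LOAD immediate value into accumulator
            acc = memory[pc + 1]
            pc += 2
        elif opcode == 0x02:          # ADD immediate value to accumulator
            acc += memory[pc + 1]
            pc += 2
        else:
            raise ValueError(f"unknown opcode {opcode:#04x}")

# "Keying in" a program as raw numbers, much like on the Elf's keypad:
program = [0x01, 0x05,    # LOAD 5
           0x02, 0x03,    # ADD 3
           0x00]          # HALT
print(run(program))       # prints 8
```

On the real machine there was no interpreter to help you, of course; the CPU itself did the fetching and decoding, and a mistyped byte simply became a wrong instruction.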

There was a saying going back to this time period that if you knew how to program in machine code then you had a much deeper understanding of how computers work.  I learned several machine languages, and this proved very useful to me in getting jobs in the video game industry and being able to do those jobs.  When I went to work for Xanterra in 1999 to do Gameboy Color programming, I sort of hit the ground running because I already knew how to program Z80s, which I had learned in the 1970s.  The owner of the company was impressed enough with my skills that he gave me a raise after my first week.

https://www.youtube.com/watch?v=HWpi9n2H3kE

Sunday, November 20, 2022

iPad Pro M2: What Does "Pro" Even Mean?

https://www.youtube.com/watch?v=O_WbmIIy4vk

If you have a good smartphone, a tablet feels unnecessary.

The last thing I need is a tablet that is 2.5 times faster than my desktop computer.  This is the kind of power you want on a laptop or a desktop.

The M1 is only 7% slower than the M2.  

A couple of years ago, I bought the Amazon Fire tablet on Black Friday for $80, which is not a powerful tablet, but it works just fine as a portable Internet and streaming device.

4K Gamer Pro Review

https://www.youtube.com/watch?v=dL9U6n4IixQ

I did some experimenting on my computer by playing video games and videos at resolutions from 720P up to 5K. With my eyesight, which is about 20/30, on a 27-inch screen, I could not tell a difference between 1080P and higher. We are talking about levels of detail that are hard to perceive. I personally prefer 1440P, not that it is really an improvement over 1080P; it is probably more psychological than anything that I think 1440P is better.

Even if you have 20/20 vision and a 60-inch screen, you are going to be sitting further back, whereas I sit very close to my 27-inch screen. Can people really tell a difference with 4K on a big-screen TV?

Many years ago, somebody made a video about how 4K was unnecessary because the human eye would not be able to tell the difference. If it is unnecessary on a 55-inch TV, then it is probably unnecessary on a smartphone. I bought an iPhone XR, which has a sub-1080P resolution, yet I never notice the resolution being too coarse.
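There is a standard back-of-envelope way to check this: 20/20 vision resolves roughly one arcminute, which works out to about 60 pixels per degree of visual angle; pixel densities above that are hard to perceive. The sketch below computes pixels per degree for a given screen and viewing distance. The screen sizes and distances are illustrative assumptions, not measurements.

```python
import math

# Rough visual-acuity check: 20/20 vision resolves about 1 arcminute,
# i.e. roughly 60 pixels per degree. Values well above 60 mean the
# display packs more detail than the eye can distinguish at that distance.

def pixels_per_degree(diag_inches, horiz_pixels, vert_pixels, distance_inches):
    """Pixels per degree of visual angle for a flat 16:9-style screen."""
    aspect = horiz_pixels / vert_pixels
    width = diag_inches * aspect / math.sqrt(1 + aspect ** 2)  # physical width
    pixel_size = width / horiz_pixels
    degrees_per_pixel = math.degrees(2 * math.atan(pixel_size / (2 * distance_inches)))
    return 1 / degrees_per_pixel

# 27-inch 1440P monitor viewed from about 24 inches:
print(round(pixels_per_degree(27, 2560, 1440, 24)))   # roughly 46
# 55-inch 4K TV viewed from about 8 feet (96 inches):
print(round(pixels_per_degree(55, 3840, 2160, 96)))   # roughly 134
```

By this rough math, a 55-inch 4K TV at a normal couch distance is already well past the eye's limit, which fits my experience that the extra resolution is hard to see.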

Friday, November 18, 2022

Sprites



The word "sprite" is interesting. It means elf, fairy, or ghost, although it can also refer to flashes of different colored lights above storm clouds caused by lightning. The word originated in the Middle Ages from the word "spirit". When I hear the word, I think of the Disney character Tinkerbell.

In computers and video games, a sprite is an image that can move on top of a background. Usually, these are 2D objects moving on top of a 2D background, although a game like the original Doom had 2D objects moving on top of a 3D background. The mouse pointer on a computer screen is technically a sprite.

Back in the days when computers and video games were 8-bit and 16-bit, it was helpful to have hardware support for sprites, which allowed graphical objects to move around independently of the background. The reason this was helpful was that it was more taxing for the old, slow computers without hardware sprites to manipulate the graphics on the screen. When I was writing games for the Timex Sinclair 2068 and Atari ST computers, I had to write software to make all the graphics move because there was no hardware support for sprites, which made the task more technically challenging.
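The core of such a software-sprite routine is simple in outline: copy the sprite's pixels onto the background, skip a "transparent" color so the sprite isn't a solid rectangle, and clip at the screen edges. Here is a minimal sketch of the idea in Python; the pixel values and transparent color are arbitrary choices for illustration.

```python
# A minimal "software sprite" blit: with no sprite hardware, the program
# itself copies the sprite's pixels onto the background, skipping a
# transparent color and clipping at the screen edges.

TRANSPARENT = 0  # pixel value treated as see-through (illustrative choice)

def blit(background, sprite, x, y):
    """Draw `sprite` (2D list of pixels) onto `background` at (x, y)."""
    height = len(background)
    width = len(background[0])
    for row, line in enumerate(sprite):
        for col, pixel in enumerate(line):
            bx, by = x + col, y + row
            # Skip transparent pixels and anything off-screen.
            if pixel != TRANSPARENT and 0 <= by < height and 0 <= bx < width:
                background[by][bx] = pixel

screen = [[0] * 6 for _ in range(4)]   # tiny 6x4 "screen" of background pixels
ball = [[0, 7, 0],
        [7, 7, 7],
        [0, 7, 0]]                     # a 3x3 sprite with transparent corners
blit(screen, ball, 2, 1)
```

To actually move a sprite on those old machines, you also had to save the background pixels under the sprite before drawing and restore them afterward, all fast enough to finish within a frame; that bookkeeping is where most of the real difficulty was.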

The early arcade video games used hardware sprites and so did all early home video game consoles. The sprites on the Atari 2600 are extremely primitive and very difficult to program, but the programmers knew how to make them work.

Many people have touted the Commodore 64 as the best 8-bit computer because it had hardware support for eight sprites, each 24x21 pixels, although this is not very many compared to the 64 sprites of the Nintendo Entertainment System that came out later. I think that the Atari 8-bit computers had better graphical capabilities overall.

Once we had 32-bit processors, there was no longer a need for hardware sprites. These systems were powerful enough that it was not a huge challenge to manipulate graphics on a screen. Also, with 32-bit systems, there was a greater emphasis on 3D graphics instead of 2D graphics.

--
Best wishes,

John Coffey

http://www.entertainmentjourney.com