Friday, December 20, 2024

Introducing the credit card sized Cardputer!!

https://youtube.com/shorts/bx4Mef6K23o?si=T0pCZ5QHuVPZLbyo

Our phones are already powerful computers. We just need easier ways to wirelessly access peripherals and cast to a screen, or to another computer. Then your phone could be your portable computer.

Saturday, December 7, 2024

Computer Intelligence

I have been making the same prediction for a couple of decades. We have seen big advances in computer intelligence, but not as quickly as I thought:

"Given current trends, it would seem pretty likely that within 10 years we could have robots that could perform just about any task you would want them to. They might not be affordable to the average person, but businesses could use them for things like mining, construction, etc. 10 years ago I was predicting that eventually robots would be able to build houses.

The technology is so close already. There have been some great advances in android type robots especially by the Japanese. They are already using robots to care for old people in nursing homes. (Something to look forward to?)

When this will happen is mostly a question of cost effectiveness. We all know that the cost of technology goes down over time.

Within 10 years it is likely that they could hold a conversation with you. Already there are robots that have full facial expressions.

This is probably true of your PC as well. Imagine a "conversation wizard" on your computer that asks "Where would you like to go today?" (Microsoft's old slogan.) What effect will this have on human interaction?

If you were to call up your bank, you might not know if you are talking to a real person.

Computer-driven cars seem likely to me. There are already cars that can parallel park themselves. I want one of those. :-)

We already have computer controlled airplanes that require no human intervention such as the Global Hawk. Newer commercial planes have an autopilot that if needed can completely take over and land the plane. Compared to driving a car, landing a plane is pretty mundane.

Consider the following: As computers get smarter, we will gradually relinquish more control to them. It will happen so slowly that we won't notice it. For example, maybe a computer would do all your investing. That seems realistic, as computer-controlled investing has happened on Wall Street for at least a decade. I hope that somewhere along the way we don't lose complete control. :-)"

Friday, November 1, 2024

History Brought to Life with Ai Magic Vol.1

https://www.youtube.com/watch?v=s_A5wwX52EY

First of all, it is astonishing what can be accomplished with AI.

At three minutes we see the future queen of England.  She earned my respect by being a truck driver and auto mechanic during World War II.


NEVER install these programs on your PC... EVER!!!

Thursday, October 3, 2024

Amazon.com: MINISFORUM Venus UM790 Pro Mini PC AMD Ryzen 9 7940HS up to 5.2 GHz 32 GB DDR5 1TB SSD with AMD Radeon 780M, 4X USB3.2, 2X USB4, 2xHDMI 2.1, 2X PCIe4.0, Wi-Fi 6E/BT5.3, RJ45 2.5 G : Electronics

I just happened to see this on Amazon.  This is the same computer I bought on sale a year ago.  I normally see it listed at $700.  Now it is $70 cheaper than what I paid for it.  (I don't know if an Amazon Prime membership is required.)

I can attest that this little computer is a beast.  It is about twice as fast as the Apple M chips unless you get the Pro or Max versions.  I wanted a powerful computer for chess analysis.  It can also play games.


https://www.amazon.com/MINISFORUM-7940HS-Radeon-USB3-2-PCIe4-0/dp/B0C7V29HBQ/ref=sw_ttl_d_wl_crc_4?_encoding=UTF8&pd_rd_i=B0C7V29HBQ&pd_rd_w=BAbtU&content-id=amzn1.sym.54f8919a-5929-48c6-8ece-4250c4138aca&pf_rd_p=54f8919a-5929-48c6-8ece-4250c4138aca&pf_rd_r=EVKW2S2ZYK5T8FX2F871&pd_rd_wg=rpV6v&pd_rd_r=eba6a46c-7f30-41ee-a717-b966d998c1fa&th=1

Friday, September 20, 2024

The Jetsons - 1950s Super Panavision 70

What you can generate with AI these days is just astonishing.

https://www.youtube.com/watch?v=QZOnC8hdX8k

@frankpoperowitzmusic   2 days ago
I could watch two straight hours of Jane just strolling around the apartment.

Wednesday, September 18, 2024

Disney Plus: You paying more for less is their brilliant business strategy

https://www.youtube.com/watch?v=DppIlEH3GaU

Good video!

A Google search shows that Disney+ is offering three months for $1.99 per month.  This is the version with ads.

Last year on Black Friday I got a combination of Disney+ and Hulu for a year for $3 per month.  

There was a similar offer on HBO Max, now just called Max.  It is my favorite streaming service.

Years ago, I got the Disney+ pre-start discount where I was paying around $4 per month by paying for three years in advance. The problem is that I didn't see a ton of new content that I wanted to watch. I had already seen most of the stuff worth watching.

Sunday, September 15, 2024

How do Video Game Graphics Work?

I knew the basics, but most of this is quite advanced.

https://www.youtube.com/watch?v=C8YtdC8mxTU

The computational power required is enormous.  Graphics cards use hundreds of watts of power.  (My mini-computer has a processor with a TDP of only 55 watts.)

In the 1980s, I tried to do simple 3D graphics on 8-bit and 16-bit computers.   Since I had at most 16 colors to work with, no advanced shading was possible.

On the Timex Sinclair 1000, which was an 8-bit 16K black-and-white computer, I managed to create a low-res 3D rotating object stored in the computer's memory.  I could then display the rotating object.  It was an impressive effect for such a simple machine.

I accomplished similar effects on the more advanced Timex Sinclair 2068 color computer.  I also did some simple 3D effects on the Atari ST computer.
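The basic math behind those rotating-object effects can be sketched in a few lines.  This is a modern illustration, not my original code; the frame count, scale, and viewing distance are arbitrary choices for the example:

```python
import math

# Rotate a set of 3D points around the Y axis, then project them onto a
# 2D screen with a simple perspective divide.
CUBE = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

def rotate_y(point, angle):
    """Rotate a 3D point around the vertical (Y) axis."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (x * c + z * s, y, -x * s + z * c)

def project(point, distance=4.0, scale=20.0):
    """Simple perspective projection: farther points shrink toward center."""
    x, y, z = point
    f = scale / (z + distance)
    return (round(x * f), round(y * f))

# Precompute the frames, as one would on an 8-bit machine with no
# floating-point hardware, then play them back for a rotation effect.
frames = [[project(rotate_y(p, step * math.pi / 4)) for p in CUBE]
          for step in range(8)]
```

Precomputing the frames is the key: the slow math happens once, and displaying the stored frames in sequence is cheap enough for even a very simple machine.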

The 8-bit machines were capable of about 500 floating point calculations per second.  The Xbox Series X console is capable of 12 trillion per second.  The advanced graphics card featured in the video is capable of 33 trillion.

Thursday, August 29, 2024

Why Don't We Have 128 Bit CPUs?

https://www.youtube.com/watch?v=MKbNOmysJQo

This is something I wondered about.  

The fact is that 64 bits is large enough to hold almost any number we would need.  Even 32 bits is adequate. There might be special cases in science where a higher level of precision is required, for example, if you need to calculate a flight path to Pluto.  Still, current 64-bit computers can do 128-bit computations if needed but less efficiently than a 128-bit processor.
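To see why 64-bit hardware can handle 128-bit math, here is a sketch of 128-bit addition done as two 64-bit operations plus a carry, which is roughly what compilers generate for 128-bit integer types.  The function name and the (high, low) convention are just for illustration:

```python
# Emulate a 128-bit add using 64-bit halves and a carry bit.
MASK64 = (1 << 64) - 1

def add128(a_hi, a_lo, b_hi, b_lo):
    """Add two 128-bit values given as (high, low) 64-bit halves."""
    lo_sum = a_lo + b_lo
    lo = lo_sum & MASK64
    carry = lo_sum >> 64              # 1 if the low halves overflowed
    hi = (a_hi + b_hi + carry) & MASK64
    return hi, lo
```

It works, but it takes two adds plus carry handling where true 128-bit hardware would take one instruction, which is the inefficiency mentioned above.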

With the extra circuitry required for a 128-bit processor, the space on the CPU die is better spent on more cores.  So there is no reason for general-purpose CPUs to ever be 128-bit.

Graphics cards, which do a massive number of calculations every millisecond,  can have processors that use more than 64 bits so that they can move data to and from memory in larger chunks.  This is a special case.

Saturday, August 10, 2024

Beyond The Mind's Eye (Complete Film)

Back in the early 1990s, a series of films called "Beyond the Mind's Eye" featured the latest in stunning computer graphics.  Although it is dated today, I was impressed enough to buy the videotape.  The sequence from 13:55  to 15:49 impressed the hell out of me.

Friday, August 9, 2024

This New Super Nintendo Game Changes EVERYTHING For Retro Games...

They have taken a 30-year-old Super Nintendo game that I did some work on and added a more powerful coprocessor to the cartridge to make it play better.  The original version used Nintendo's Super FX chip on the cartridge to create 3D graphics, but it wasn't powerful enough to do the game well.

https://youtu.be/8lor1zFo6e4?si=AwGtaneNHGsx7kIw


Sunday, May 12, 2024

The FREE ‘Never Obsolete’ PC from 2000! eMachines eTower 566ir


My mother had an eMachines computer.  I wasn't impressed by it, but at the time it was adequate for accessing the Internet.  However, if a person still has one of these old machines, there are many old video games that can be played on it.

The state of computers around the year 2000 was interesting.  People were buying their first computers to access the Internet.  Before this, I had abandoned the Frankenstein computer I had built up a piece at a time and purchased a 400 MHz computer, which quickly became obsolete.  In the early 2000s I purchased a roughly 2 GHz single-core computer, which seemed amazingly powerful at the time.  In 2005, I upgraded to a 2.4 GHz dual-core computer, which also seemed like a big step up.

I purchased an i7 iMac in 2010 for around $2,000.  This was one of the more powerful computers on the market but it would seem sluggish by today's standards.  It had only slightly better performance than my old outdated laptop.   A couple of years ago I gave the almost dead computer to a repair shop for parts.

Tuesday, May 7, 2024

M4 iPad Pro Impressions: Well This is Awkward

The M4 iPad Pro display has a brightness of 1000 nits that can go up to 1600 nits.

Considering that my computer monitor is plenty bright at 350 nits, this seems like overkill.  It would be useful if you wanted to read the iPad in direct sunlight, but how many people need that feature?  My iPhone XR is rated at around 600 nits, and it is reportedly readable in direct sunlight.

https://www.youtube.com/watch?v=-T0MGehwWvE

Plus it has a processor more powerful than all but a few desktop computers.  Overkill, which is why Apple wants to charge a thousand dollars for this thing.

The iPad is not a very useful device.  It is more limited than a computer, and less convenient than a smartphone.   

If I want a tablet I can get a lower-end iPad for around $300, and less on Black Friday.  I bought a Fire Tablet on Black Friday for around $80, and even though it is not a very powerful tablet, it can do basic internet tasks,  stream video, and play games.

Super Rare Arcade Game or Pinball Machine?

Re: Rare Golden Ship Galaga Glitch

Jeff,

1.  There is a bug in the original arcade Galaga where a player can get the "bees" to stop firing missiles.   Although this is part of the game, apparently it is considered a "cheat" and disallowed in high score records.  https://www.youtube.com/watch?v=dtYQB3JOFoc

If the player kills all the bees except for two on the left side and then doesn't kill them for 7 to 20 minutes, usually around 20 minutes, then all the bees will stop firing for the remainder of the game.  I have been able to reproduce this on my Arcade1up and my PC.

This works because there is a table in memory for all the missiles.  There might only be 8 entries in this table.  The table size limits how many missiles can be on the screen simultaneously.

A horizontal value of zero in the table means the missile is inactive.  The way the coordinates work in the game is that a horizontal value of zero is slightly off the left edge of the screen.

If you don't shoot the bees for 20 minutes, they will fire a bunch of missiles, and occasionally a few missiles will be created with a random horizontal coordinate of zero.  Even though these missiles are "active", the code ignores them.  Eventually, the table fills up with missiles that the code ignores.
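Here is a toy model of what I think is going on.  The table size, field names, and update logic are my guesses for illustration, not disassembly of the actual game:

```python
# Hypothetical model of the Galaga missile-table bug described above.
TABLE_SIZE = 8
missiles = [{"active": False, "x": 0, "y": 0} for _ in range(TABLE_SIZE)]

def spawn_missile(x, y):
    """Claim a free slot for a new enemy missile; fail if the table is full."""
    for m in missiles:
        if not m["active"]:
            m.update(active=True, x=x, y=y)
            return True
    return False                      # table full: the bees cannot fire

def update_missiles():
    """Move each active missile; the bug is that x == 0 is skipped."""
    for m in missiles:
        if not m["active"] or m["x"] == 0:
            continue                  # x == 0 missiles are never moved,
                                      # so their slots are never freed
        m["y"] += 2
        if m["y"] > 240:              # off the bottom of the screen
            m["active"] = False
```

In this model, once eight missiles have spawned with x of zero, every slot is permanently occupied and spawn_missile always fails, which would match the bees going silent for the rest of the game.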

I think I heard that later versions of Galaga fixed this bug, although I am not sure.  I wish they had kept it in.  It is a nice easter egg.


2.  Galaga has a method of doubling your firepower by letting your ship be captured and then rescuing it.  I usually avoid this because it at least doubles the risk of losing a ship.  However, I just watched some YouTube videos where people going for a high score get double ships.  The way they survive without losing one of their ships is to shoot most of the bees as they enter the screen.  The small number of bees that survive are easier to deal with.



3.  The video hardware has a large number of sprites that can be displayed on the screen simultaneously.  This was impressive for its time.  However, the number of sprites still has a limit.  

The game allows the player with double ships to fire up to four missiles at a time, but reportedly this was too many sprites.  So the game uses a single sprite to display two missiles, and four missiles are displayed with just two sprites.  

As a former video game programmer, I find this interesting because I am not sure what happens if just one of the two missiles in a single sprite hits an enemy.  Do both missiles disappear, or do both missiles keep going up the screen, or does the game switch to a sprite with just one missile?  It is hard to tell by watching the game because the usual case is that both missiles hit an enemy.  I am going to test this.


4.  I am also wondering how the NES port was able to display so many sprites.  The NES has a limit of 8 sprites on the same horizontal line, so if it exceeds this there might be some sprite flicker.  I am going to test this as well.

--


On Mon, May 6, 2024 at 11:03 PM Jeff Wires wrote:
I love Galaga and never knew this! This is just giving me more to talk about on the show!

Appreciate it!
-Jeff

On Sat, May 4, 2024 at 4:13 PM John Coffey wrote:
https://www.youtube.com/watch?v=t_58poXHxOg&t=877s

The golden ship reemerges when the game switches back to the second player.


The enemy ship switched to the wrong characters for that sprite but kept the same color palette.  If we assume that ship type is a single byte value, then that byte was somehow overwritten with a bad value, hypothetically a zero or a one which could be the value for the player ship.  Errors like this could be caused by a memory overflow where a value was written outside the bounds of a table or the limited stack memory was exceeded.

I learned that if Player 1 exceeds 999,999 points, it will not display the millions in the score.  It is not clear if the player's score still counts as being over a million, or if it goes back to zero.  However, the second player does not have this problem and his score can reach 8 digits.




Saturday, May 4, 2024

Rare Golden Ship Galaga Glitch

https://www.youtube.com/watch?v=t_58poXHxOg&t=877s

The golden ship reemerges when the game switches back to the second player.


The enemy ship switched to the wrong characters for that sprite but kept the same color palette.  If we assume that ship type is a single byte value, then that byte was somehow overwritten with a bad value, hypothetically a zero or a one which could be the value for the player ship.  Errors like this could be caused by a memory overflow where a value was written outside the bounds of a table or the limited stack memory was exceeded.

I learned that if Player 1 exceeds 999,999 points, it will not display the millions in the score.  It is not clear if the player's score still counts as being over a million, or if it goes back to zero.  However, the second player does not have this problem and his score can reach 8 digits.
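If the score display simply has six digit positions, the shown score would effectively be the real score modulo 1,000,000.  This is a guess at the mechanism, and the function is a hypothetical illustration, not the actual code:

```python
# Hypothetical: a six-digit display shows score modulo 1,000,000.
def displayed_score(score, digits=6):
    """Render a score into a fixed number of digit positions."""
    return str(score % 10 ** digits).rjust(digits, "0")
```

Under this guess the internal score could still be over a million even though the display wraps, which would explain why it is hard to tell from watching the game.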

Wednesday, April 24, 2024

Re: Response to "The 6502 CPU Powered a Whole Generation!"

For some reason, Gmail sometimes destroys my formatting.  This was a problem with cutting and pasting.

Here is a corrected version:



 https://www.youtube.com/watch?v=acUH4lWe2NQ

I programmed the SNES, Sony PlayStation, and Game Boy Color for a couple of major video game developers. My experience with Z80 and 6502 programming goes back to the 1970s.

I love how efficient the 6502 processor is. I have no doubt that for 8-bit operations it is twice as fast as a Z80. The first chess computers were 4 MHz Z80s, but manufacturers switched to the 6502, starting at 2, 3, and 4 MHz, and eventually going up to 6 MHz.

The Z80 has almost twice as many transistors as the 6502, but the 6502 was wisely designed to be fast and cheap.

However, I have to cry foul when it is stated that the Z80 takes an average of 13 clock cycles per instruction.

The Z80 has extra capabilities like 16-bit math and loads. When we compare 8-bit to 8-bit then the 6502 is at most twice as fast.

The Z80 index registers are very inefficient so I never used these registers.

Eight-bit load instructions take 4 to 7 clock cycles depending on the instruction type. Indexed 8-bit loads take a whopping 19 clock cycles.

16-bit loads take 10 to 20 clock cycles depending on the instruction type.

Eight-bit math takes 4 to 7 clock cycles depending upon the type of instruction. Incrementing or decrementing a memory location takes 11 clock cycles, and once again using the inefficient index registers takes 22 clock cycles.

Sixteen-bit math takes 6 to 11 clock cycles, and the indexed registers take 15 clock cycles.

Bit shifting the accumulator takes 4 clock cycles. Bit shifting other registers takes 8 clock cycles. Bit-shifting memory takes 14 clock cycles.

The block memory copy instruction takes a whopping 22 cycles per byte. For copying the same bytes into a block of memory, one can hijack the SP for an average of 5.5 clock cycles per byte not counting control instructions. (There are efficient ways to write this.)
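A quick sanity check of that arithmetic, using the cycle counts quoted above:

```python
# Compare cycles per byte: Z80 block copy vs. the stack-pointer trick.
BLOCK_COPY_CYCLES_PER_BYTE = 22      # block copy instruction, per byte
PUSH_CYCLES = 11                     # one PUSH writes 2 bytes to memory
BYTES_PER_PUSH = 2

push_cycles_per_byte = PUSH_CYCLES / BYTES_PER_PUSH   # 5.5 cycles/byte
speedup = BLOCK_COPY_CYCLES_PER_BYTE / push_cycles_per_byte
```

So for filling memory with a repeated byte pair, the stack trick is roughly four times faster than the block copy instruction, before counting loop-control overhead.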


I take issue with your criticism of the Timex Sinclair 2068 computer. I am one of many people who are still fans of this computer. This has much to do with the cost of the computer. The C64 cost $595 when released, which is about $1900 today adjusted for inflation. There was absolutely no way I could afford a C64, but the 2068 was actually a pretty good computer for $200 which I could afford. For me, it was a choice of the 2068 or no computer at all. (You claimed that the C64 was also selling for $200, but this is not what I saw back in 1983. It was hundreds of dollars more. You pointed to an ad that listed the C64 for $200, but I looked at the fine print which showed it was $400 with free software supposedly worth $200.)

The 2068 had a better BASIC with a 24K ROM and additional instructions for accessing graphics and sound.

The C64 definitely had better graphics capabilities for games with sprites and smooth scrolling. However, it was still possible to write games for the 2068. I wrote and self-published a Boulder Dash clone for the 2068 called "Diamond Mike". There are videos of it here on YouTube.

If I were comparing only the processors, I would choose a 3.58 MHz Z80 over a 1 MHz 6502. Obviously, I would prefer a faster 6502. I very much enjoyed programming the SNES with its 3.58 MHz 65C816. (The reason 3.58 MHz was used for multiple computers and video game systems is that it is 1/4 the speed of the color signal crystal. They used one crystal to drive both things as a cost-saving measure.)

Best wishes,

John Coffey


Saturday, April 20, 2024

Apple's Silicon Magic Is Over!

https://youtu.be/AOlXmv9EiPo?si=j_EAv_ahdgcmzEvI

Chips are reaching the physical limits of what is possible. According to a video I watched today, an experimental 1 nm process only has 1 atom per transistor. That doesn't sound right to me, but if true, then you can't go smaller than that.

There might still be some room for improvement, but smaller circuits tend to fail because quantum effects cause electrons to jump between circuits.

Someday computers might have circuits that use light instead of electricity, but right now the technology is not even close.

What would it mean if computers can't get much faster? If you need more computing power, then you would need more chips, or bigger ones, which either way consume more energy.

However, it feels like computers are plenty powerful right now. It is not like the general public has a strong need for more powerful chips.

Apple already has good competition from AMD. My mini computer is twice as powerful as the Apple M2 chip.

Sunday, April 14, 2024

A Look Inside Apple's $130 USB-C Cable

https://www.youtube.com/watch?v=AD5aAd8Oy84

Apple sells things at a higher margin than other companies.  Although this is a superior cable for data transmission, maybe competitors will offer comparable products.

Saturday, March 30, 2024

How Sinclair Spectrum games handled color.

The Spectrum computer could only display one foreground and one background color for every 8x8 pixel square.

Many games would have multicolored backgrounds and a single-color foreground, maybe black.  Then "sprites" would be black and the same color as the background.  It works, but it looks hokey, showing just how inferior the color capabilities of the Spectrum are.  It makes it look like you are playing a black-and-white game with a color overlay on the screen.
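For reference, here is how the Spectrum packs those per-cell colors into a single attribute byte; the decoder function is just for illustration:

```python
# ZX Spectrum attribute byte, one per 8x8 cell: bit 7 = FLASH,
# bit 6 = BRIGHT, bits 5-3 = PAPER (background), bits 2-0 = INK
# (foreground).
def decode_attribute(attr):
    """Unpack one Spectrum attribute byte into its fields."""
    return {
        "flash":  bool(attr & 0x80),
        "bright": bool(attr & 0x40),
        "paper":  (attr >> 3) & 0x07,   # background color, 0-7
        "ink":    attr & 0x07,          # foreground color, 0-7
    }
```

With 32 x 24 cells covering the 256x192 screen, the whole color map is only 768 bytes, which is exactly why each 8x8 square is limited to one ink and one paper color.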

When the computer was released, it competed against computers like the Apple II and the Atari 8-bit.  The C64 would come out 6 months later.

 

The 2068 had three extra graphics modes, if you count frame swapping as one mode.  Nobody used the extra graphics modes, as most games were ports from the Spectrum.

I wanted to do great things with the extended color mode, which allowed two colors for every 8x1 strip of pixels.

Why is AI so bad at spelling? Because image generators aren’t actually reading text | TechCrunch

Friday, March 15, 2024

The Game of Risk - Numberphile

https://www.youtube.com/watch?v=RdooKXXcWWc

I wrote a Risk AI for the Sega Genesis in the mid-90s.  My algorithm was brute force, trying to look a few moves ahead.  It was computationally intensive and evaluated moves on their chances of success.  This resulted in an aggressive approach that wanted to move a large army, attacking countries one at a time.
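The "chances of success" part can be estimated by simulating the dice.  This is a modern sketch of the idea, not my Genesis code, using the standard Risk rules: the attacker rolls up to 3 dice, the defender up to 2, the highest dice are compared pairwise, and ties go to the defender:

```python
import random

def battle_round(attackers, defenders):
    """One Risk dice exchange; returns (attacker_losses, defender_losses)."""
    a = sorted((random.randint(1, 6) for _ in range(min(3, attackers))),
               reverse=True)
    d = sorted((random.randint(1, 6) for _ in range(min(2, defenders))),
               reverse=True)
    a_loss = d_loss = 0
    for ar, dr in zip(a, d):
        if ar > dr:
            d_loss += 1
        else:
            a_loss += 1               # ties favor the defender
    return a_loss, d_loss

def win_probability(attackers, defenders, trials=10000):
    """Monte Carlo estimate that the attacker conquers the territory.
    One army must stay behind, so the attacker fights with attackers - 1."""
    wins = 0
    for _ in range(trials):
        a, d = attackers - 1, defenders
        while a > 0 and d > 0:
            a_loss, d_loss = battle_round(a, d)
            a -= a_loss
            d -= d_loss
        wins += (d == 0)
    return wins / trials
```

An AI built on estimates like this naturally favors attacking with a big army against one small country at a time, since those are the fights with the best odds.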

The lead programmer didn't like my AI and wanted to use his own.  I convinced his manager that my algorithm was winning and he told the lead programmer to use my code.  However, the lead programmer used his own AI instead.

Your Internet is Too Fast


It was costing me around $90 a month for gigabit Internet service, so I downgraded to 200 Mbps for about $35 per month, and it is plenty fast for me.

Wednesday, March 13, 2024

68000 processor vs. 65C816 processor

The 68000 was by far the easiest assembly language to program. The instruction set is HUGE compared to the 6502, which has a small instruction set. As I recall, the 68000 allows you to have 8, 16, and 32-bit operations in both moving data and math, and it has a bunch of internal registers.

For me, programming the 68000 didn't feel much different from writing code in C.

However, the 68000 takes more clock cycles to execute its instructions. If we compare systems like the original Apple Macintosh vs. the Apple IIGS, or the Sega Genesis vs. the SNES, in both cases you have a roughly 8 MHz 68000 versus a 2.8 or 3.58 MHz 65816. The 65816 only has an 8-bit data bus but can execute instructions faster. Also, the original Macintosh used about 37% of its CPU time to drive the display.

The 68000 is superior, but the 65816 is competitive with it. It is like CISC vs. RISC.

Apple deliberately limited the speed of the Apple IIGS because they didn't want it to compete with the more expensive Macintosh. The chip can go up to 14 MHz. Some people stated that the Apple IIGS was the computer that Apple should have made instead of the Macintosh. It was cheaper, ran a graphical user interface, and was backward compatible with the Apple II.

Tuesday, March 12, 2024

Do Hard Drives Vibrate?

I noticed that when I put my hand on my portable 5TB hard drive, it has a slight vibration.  I didn't notice this before, so I was worried about it.

Google's AI tells me the following...

A small amount of vibration is normal for spinning drives. When in use, a hard drive's platters spin at 7,200 RPM, and its actuators and heads move to read and write data. This movement can cause a slight vibration in the X-Y plane. A healthy drive also makes mild, regular whirring sounds.
Here are some other things to consider about hard drive vibrations:
  • Rattling noises
    In most cases, rattling noises are normal and are no reason to worry. They are caused by the mechanical parts of the HDD and just indicate that the drive is reading, writing, or calibrating itself.
  • Shock resistance
    Average HDD shock resistance is 50+ G while operating and 300+ G while non-operating.
  • Driving with a desktop hard drive
    Some say that you should not drive with a desktop hard drive, or otherwise expose it to vibration, while it is powered on.


Saturday, February 24, 2024

Exciting decade for computer chips


In the 2010s computers advanced at a snail's pace. Sometimes people got excited when chips were 10% faster from one year to the next.

However, this decade has been dramatically different and it has everything to do with the circuit sizes on those chips. My 2009 iMac that died had a 42-nanometer chip. My 2017 iMac has a 14-nanometer chip. My new mini-computer has a 4-nanometer chip and it is very powerful for something not much bigger than a Whopper sandwich.

In late 2020, both Microsoft and Sony released their latest generation of video game consoles using 7-nanometer chips. Since these have APUs with the graphics "card" built into the main processor, I wanted something like this in a computer.  I waited almost 4 years to be able to get something similar in a mini-desktop PC.

These smaller circuits are not cheap to make.  The equipment to make them can cost billions.  This is why Taiwan Semiconductor, which made the investment, is the number one manufacturer of these chips.

But it has resulted in a paradigm shift where people realize that they don't have to spend a fortune on graphics cards to play games. The APUs are not as good as a $1200 gaming PC, but they are good enough.

This also has resulted in hand-held gaming systems that are as powerful as a computer and can also be used as a computer when hooked up to a monitor, keyboard, and mouse.

Even the most recent iPhone can double as a gaming system.

Reportedly, Sony, Nintendo, and Microsoft are all coming out with more powerful gaming systems in the coming year. This has been an exciting decade for computer chips.

--
Best wishes,

John Coffey


Tuesday, February 20, 2024

Who The Hell Asked For A PS5 Pro?

https://www.youtube.com/watch?v=k--psgxdUJw

I've been saying the same thing.  I also bought a powerful mini-PC that can play games.  All of this change, including the consoles and the handhelds, has happened because of AMD's new powerful energy-efficient chips.  Intel has chips too, but they are not quite as good.

The PS5 and the Xbox Series X consoles are 10.28- and 12-teraflop machines, respectively.  That is still a great deal of performance, and you would have to spend a lot on a PC to match it.

Apple is also starting to push gaming on the most expensive iPhones.

I have no faith in Cloud Gaming, but this may eventually be the future where you can run triple-A games on any device with a screen.

Saturday, February 17, 2024

Why Everyone is Wrong about the Apple Vision Pro (including me)



@john2001plus
0 seconds ago
I don't need to wear a heavy screen on my head when I have a house full of more screens than I actually use.  Two desktops and an iPhone are enough, but I also have a TV and a laptop that I rarely use, and two tablets I don't use.  I also have a handheld game system that doesn't get much use.

Thursday, February 8, 2024

USB-C Tutorial for Everybody

The bottom line is that when buying a USB-C cable, make sure that it supports the data speed that you want.  The same thing applies to power delivery for fast charging.

https://www.youtube.com/watch?v=qV03FfdPHOw&t=1308s

Friday, February 2, 2024

Using Apple Vision Pro: What It’s Actually Like!

I find this interesting. I can't imagine it being more than a novelty for now. For it to be anything more than a novelty, it is going to have to immerse you into another world.

Saturday, January 27, 2024

34 *AWESOME* Atari 2600 Homebrews! Episode 1!

https://www.youtube.com/watch?v=YMzkgTKJm84

It is hard to believe that these are games for the 2600, which only had 128 bytes of RAM.

A few 2600 games had extra RAM or other chips to give better graphics or sound. I suspect that the emulators can handle the extra hardware.

Thursday, January 11, 2024

WARNING: ChatGPT Could Be The Start Of The End! Sam Harris

Sam Harris spends 30 minutes talking about the dangers of AI.

https://www.youtube.com/watch?v=GmlrEgLGozw

He makes assumptions about the future. I think that he underestimates the difficulty of building a general AI.  

I think that ChatGPT is overhyped.  It is like a Wikipedia that can talk.  It has no understanding except to predict what words should follow other words based on statistical information.  This is why it gets so much wrong.  I asked it to write some computer code and the answer wasn't even remotely correct.  
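To illustrate what I mean by predicting words from statistics, here is a toy next-word predictor.  Real language models are vastly more sophisticated, but the underlying idea of predicting the next word is the same; the function names and example sentence are just for illustration:

```python
from collections import Counter, defaultdict

# A bigram model: count which word tends to follow which, with no
# understanding at all, then predict the most common follower.
def train_bigrams(text):
    """Build a table mapping each word to a Counter of its followers."""
    words = text.split()
    table = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def predict_next(table, word):
    """Return the most common follower of `word`, or None if unseen."""
    if word not in table:
        return None
    return table[word].most_common(1)[0][0]

table = train_bigrams("the cat sat on the mat and the cat ran")
```

Here predict_next(table, "the") returns "cat", simply because "cat" followed "the" twice in the training text while "mat" followed it only once. The model has no idea what a cat is.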

We will inevitably develop general AI, but AI is a tool to solve specific problems.  We don't have to make an AI that matches human intelligence when it is more efficient to have problem-specific AI.  Calculators can do math far better than I can, and even the best 8-bit chess computers can outplay me at chess.  It would be like saying that when we developed mechanical locomotion, we needed to make a machine that functioned exactly like a horse.  We found better ways to do locomotion.

This means that AI will be solving problems long before we have a general AI, but more importantly, we will be treating it as a tool, just like any other tool.  For example, twenty years ago I was annoyed when Microsoft Word automatically corrected my spelling without asking me.  It felt like the machines were already becoming smarter than us.  Although that was a novel experience twenty years ago, we wouldn't think twice about it today.