Friday, January 14, 2022

The New Snapdragon 8 Gen 1

Computer processing power matters to me because I do many processor-intensive tasks.  When I bought a 2009 iMac with a Core i7-860 processor, it was one of the fastest computers you could buy.  Today it gets stomped by most of the computers on the market.

The previous decade was a period of only marginal advancement in computer microprocessor power.  People were bragging about 10 to 20% improvements.  However, more progress was being made in mobile processors, especially by Apple.

Nevertheless, since 2020 we have seen some amazing progress.

This might not matter to most people, but the latest and greatest Android smartphone processor is 25% more powerful than my 2017 desktop iMac running a 3.4 GHz i5-7500 with active cooling.

https://www.youtube.com/watch?v=syJ3xn4q9xo&t=80s

My 2017 computer has a Geekbench single-core score of 959 and a multi-core score of 3072.

The Apple A15 Bionic found in the iPhone 13 received a single-core score of 1732 and a multi-core score of 4685.

The M1 chip used in the latest Apple laptops has scores of 1717 and 7328.  The M1 Max chip in Apple's high-end MacBook Pros has scores of 1749 and 11542.
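
To put those numbers in perspective, the improvements can be worked out directly from the scores above:

1732 / 959 ≈ 1.81, so the A15 in a phone is roughly 80% faster than my i5-7500 at single-core work.
4685 / 3072 ≈ 1.53, or roughly 50% faster at multi-core work.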

I have no interest in buying another Apple computer, but I am impressed with their products.  It is only a matter of time before the competition catches up.

However, I am interested in the AMD Ryzen 7000 processors that will be released in the second half of this year.  This will be the first time that all of AMD's new desktop processors will have built-in graphics, possibly eliminating the need to buy a separate graphics card.

Monday, December 13, 2021

The Matrix Movies vs The Matrix Unreal Engine 5 Experience Comparison

A video game engine can pretty much simulate reality.

https://youtu.be/CpH_bu0s7C8

The first Unreal Engine was built for a video game called Unreal in the late '90s. I played the game just a little. The graphics were very crude compared to today, but they were a step up from what we were used to.

Best wishes,

John Coffey

Sunday, December 12, 2021

Doom (video game)

The 1993 videogame Doom practically invented the First Person Shooter, although that really started with its predecessor Wolfenstein 3D.  

It is still my favorite videogame.  I had the privilege of contributing to the Super Nintendo version of Doom.  My name is in the credits.  However, the Super Nintendo version is not as good as the PC version. (See the video at 32:45.)

I haven't played many newer First Person Shooters, which means that I'm at least a couple of decades out of date.  

So I find myself wondering: how do you best a perfect game like Doom?  All First Person Shooters have their roots in Doom.  Newer games have better graphics and more story.

Sunday, December 5, 2021

Is Your iPad Obsolete and Outdated?

My iPad 4 was a serious investment. I don't think that I got $400 value out of it. There are many apps that it will no longer run, so I feel abandoned by Apple.

My $75 Fire HD 10 inch tablet is almost as powerful and runs everything I have tried.

The situation is far worse with the $200 Microsoft Surface tablet that I purchased 10 years ago. It was very underwhelming to begin with, and now it will run next to nothing. There is an online support group of people who for some strange reason are still fans of this tablet.

I've been arguing that people don't need tablets if they have a good smartphone.

https://www.lifewire.com/obsolete-ipad-4138570

Tuesday, November 23, 2021

Transistor count

Apple tops the transistor count in Microprocessors.

https://en.wikipedia.org/wiki/Transistor_count

Object-Oriented Programming is Bad

There aren't many people that I can talk to about computer programming.

https://www.youtube.com/watch?v=QM1iUe6IofM

In the 1970s, I learned that you write functions to avoid duplicating code.  The example most often given is a square root function.  You only need to write it once and call it from multiple places.
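
As a toy illustration in C++ (the function name and the Newton's-method loop here are mine, purely for the sake of example), the square root code is written once and then called from wherever it is needed:

#include <cstdio>

// Written once: a crude square root using Newton's method (assumes a >= 0).
double mySqrt(double a)
{
    double x = (a > 1.0) ? a : 1.0;   // rough initial guess
    for (int i = 0; i < 30; i++)
        x = 0.5 * (x + a / x);        // average the guess with a divided by the guess
    return x;
}

int main()
{
    // ...and called from multiple places instead of duplicating the loop.
    std::printf("sqrt(2)  = %f\n", mySqrt(2.0));
    std::printf("sqrt(10) = %f\n", mySqrt(10.0));
    return 0;
}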

In the 1980s, I learned in school that you should break long sections of code into smaller, easier-to-understand pieces by calling functions.  For example...

initializeGame()
playGame()
terminateGame()

This can make the code somewhat self-documenting.  I became a big fan of this style of programming, even while writing in assembly language, which is what I mostly used in the videogame industry.  In the late 1990s, one of my coworkers accused me of writing "spaghetti code" by doing this, although I still like this style.
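
Fleshed out just a little (the stub bodies below are hypothetical; a real game obviously does far more in each step), that outline becomes a complete program whose top level reads like a table of contents:

#include <cstdio>

// Each function hides the details of one phase of the program.
void initializeGame() { std::printf("load assets, set up the initial state\n"); }
void playGame()       { std::printf("run the main loop until the player quits\n"); }
void terminateGame()  { std::printf("save settings, free resources\n"); }

int main()
{
    initializeGame();
    playGame();
    terminateGame();
    return 0;
}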

I didn't learn about Object-Oriented Programming until the 1990s.  I had to use it with Visual C++, but I didn't do much with it, and I didn't feel comfortable with it.  Since then I have grown somewhat accustomed to it, but I never reached 100% comfort with it.  Initially, I believed that Object-Oriented Programming was only useful for Graphical User Interfaces, which is primarily what it was recommended for.

Reportedly, Microsoft was pushing Object-Oriented Programming in the 1990s.

I have found that debugging object-oriented code can be a nightmare, especially when dealing with inheritance.  In that case, it also feels like "spaghetti code."

I don't look at Objects as a style of programming, but as data structures that are occasionally useful.  If the code is very tightly bound to a specific set of data then putting it in an Object helps organize the code.  If you have multiple independent instances of a data structure, then the code is (possibly) cleaner if you put it in an Object. 
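
Here is a small example of the second case (the Timer class and its names are hypothetical, just to illustrate the point): the data and the few functions tightly bound to it live together, and each instance carries its own state.

#include <cstdio>

class Timer
{
public:
    void   reset()            { elapsed = 0.0; }
    void   advance(double dt) { elapsed += dt; }   // add dt seconds of elapsed time
    double seconds() const    { return elapsed; }
private:
    double elapsed = 0.0;
};

int main()
{
    // Multiple independent instances of the same data structure.
    Timer levelTimer;
    Timer respawnTimer;

    levelTimer.advance(1.5);
    respawnTimer.advance(0.25);

    std::printf("level: %.2f s, respawn: %.2f s\n",
                levelTimer.seconds(), respawnTimer.seconds());
    return 0;
}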

That's not to say that I am a big fan of OOP.  In most cases, I don't find a compelling reason to use Objects.  I am glad to see a video that favors "Procedural Code".

It seems like people in the computer industry have for a long time been trying to deal with the issue of complexity.  Back in the 1980s people were pushing "Structured Programming."  Today, I don't even know what that is, but in the 1980s I found the buzzwords enticing.

Best wishes,

John Coffey

Saturday, November 20, 2021

The Amazon Fire HD 10 is 50% off right now in Black Friday tablet deal

I'm going to start off by saying that I don't know why anybody needs a tablet.  Compared to smartphones with large 5.5- to 6.5-inch screens, tablets are bulkier and more difficult to take everywhere.  However, Amazon has $75 off their 2021 10.1-inch tablets, which is a heck of a nice deal on already budget tablets.  Last year during Black Friday I purchased the 2019 model for $80, and it is fine as a tablet.  My only complaint, besides not really needing a tablet, is that iPads, costing hundreds of dollars more, feel nicer to hold in the hands.

If you want more processing power and slightly more RAM, and you probably should because it will provide a better overall experience, then spend the extra $30 to get the "Plus" model, which is also $75 off.

Reviewers have complained about being limited to the Amazon ecosystem, with the Amazon store and Amazon software.  However, there are fairly easy ways to get around this, and there are videos on YouTube showing how to install the Google Play Store or how to turn the device into a regular Android tablet.

You might get much more value out of the tablet if you have an Amazon Prime membership.

https://www.tomsguide.com/deals/the-amazon-fire-hd-10-is-50-off-right-now-in-black-friday-tablet-deal

https://www.tomsguide.com/reviews/amazon-fire-hd-10-2021

Saturday, October 23, 2021

"Game Development in Eight Bits" by Kevin Zurawel

I find this very interesting.  When I was programming for the SNES, we used the same techniques, but the SNES has 192KB of RAM and cartridge ROMs of roughly 1 to 4 megabytes.  Some cartridges were much smaller.

https://www.youtube.com/watch?v=TPbroUDHG0s

Wednesday, October 20, 2021

Apple Took All My Money

https://www.youtube.com/watch?v=PNUQ2o-wiL8&t=607s

I'm not personally interested in laptops, but I am impressed with the progress Apple has made with its custom processors.  Apple was the first company to ship a 5-nanometer processor.  I'm waiting for AMD and Intel to catch up.  Until recently, Intel was struggling to go from 14 nanometers to 10 nanometers.

The biggest chip manufacturer, TSMC, is in Taiwan.  It has been reported that Intel has contracted for 100% of its not-yet-available 3-nanometer production.  In other words, everyone else is out of luck and would have to look elsewhere to produce faster chips.

Tuesday, October 12, 2021

Neat AI does Lenia - Conway's game of life arrives in the 21st century

The "Game of Life" is not an actual game, but a computer simulation invented by the mathematician John Conway around 50 years ago.  It was one of the first things I learned about computers.  It follows a couple of simple rules that create interesting self-propagating patterns.
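
For anyone who hasn't seen it, a live cell survives if it has two or three live neighbors, and a dead cell comes to life if it has exactly three.  Here is a minimal sketch of one generation in C++, assuming a small fixed-size grid with everything outside the edges treated as dead:

#include <cstdio>

const int W = 16, H = 16;

// Compute one generation of the Game of Life.
void step(unsigned char cur[H][W], unsigned char next[H][W])
{
    for (int y = 0; y < H; y++)
    {
        for (int x = 0; x < W; x++)
        {
            // Count the live neighbors of cell (x, y).
            int neighbors = 0;
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++)
                {
                    if (dx == 0 && dy == 0) continue;
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < W && ny >= 0 && ny < H)
                        neighbors += cur[ny][nx];
                }

            // Birth with exactly 3 live neighbors; survival with 2 or 3.
            if (cur[y][x])
                next[y][x] = (neighbors == 2 || neighbors == 3);
            else
                next[y][x] = (neighbors == 3);
        }
    }
}

int main()
{
    unsigned char a[H][W] = {0}, b[H][W] = {0};

    // Seed a "glider", one of the classic self-propagating patterns.
    a[1][2] = a[2][3] = a[3][1] = a[3][2] = a[3][3] = 1;

    step(a, b);   // b now holds the next generation

    for (int y = 0; y < H; y++)
    {
        for (int x = 0; x < W; x++)
            std::printf("%c", b[y][x] ? '#' : '.');
        std::printf("\n");
    }
    return 0;
}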

Apparently, someone has taken this to a much more advanced level.

Monday, September 20, 2021

Did Apple Just Prove the iPhone Could be Cheaper?

More than one person has pointed out that the new iPad Mini is cheaper than the new iPhones, with essentially the same hardware.

https://www.youtube.com/watch?v=jPidIspifRM&t=837s

Tuesday, August 31, 2021

Streaming videogames



Roughly 32 years ago I had an argument with a coworker.  He argued that once internet speeds became fast enough to transmit full-screen video, we wouldn't need game consoles, since we would be able to stream video games from a server to our computer screens.  Rather than pay for expensive hardware, that hardware could be on a server someplace, saving us money.

I have to admit that he had remarkable foresight literally 30 years ahead of his time.  This was at a time when the Internet was text only.  However, I saw a number of problems with his idea...

1.  Internet speeds were still fairly low, like 1,200 to 2,400 bits per second.

2.  Latency is always an issue when playing games.  No matter how fast your Internet is, there is an overhead to transmitting data back and forth.  (See the rough arithmetic after this list.)

3.  It is always advantageous to have your own hardware.  Imagine having to share hardware with other people competing for the same physical resources.  I figured that hardware would get cheaper over time, eliminating the need to share hardware with other people.

4.  His idea reminded me of the early days of computing where you would have to dial into a mainframe using a dumb terminal, one of which I actually owned and used at the time, whereas the new trend in computing was for everyone to have their own computer.
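
To put the latency point (item 2) in rough numbers: at 60 frames per second, a new frame is drawn every 1000 / 60 ≈ 17 milliseconds, so a network round trip of, say, 40 to 50 milliseconds adds two or three frames of lag on top of whatever delay the game itself has, and no amount of extra bandwidth makes that round trip go away.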

I argued that streaming video games would never be practical.  He couldn't understand why I didn't see the obvious wisdom of his idea.

Two years ago Google introduced Stadia, a video game streaming service, and it totally flopped.  Other companies like Microsoft and Amazon are working on the same idea, but they all suffer from the same problems, such as latency.

It makes very little sense to be dependent on unreliable Internet communication and shared hardware to play games when you can purchase a video game console like the Xbox Series X for $500.  Putting hardware in a centralized location instead of your living room isn't necessarily cheaper, except that you can share that hardware with other people, but what if you all want to use the hardware at the same time?

In theory, this could become practical someday, but the same technology that will make this more feasible will also make it more feasible for you to have your own hardware that is just as good.  This is the problem I saw three decades ago.



Apple Store vs. Repair Shop: What the Right to Repair Is All About

How THIS wallpaper kills your phone.

How are we going to do this?

He is getting faster internet across 6 kilometers of water than what I get across my family room.

I'm running a 5 GHz Wi-Fi router, which is around double the frequency of a microwave oven, probably at 100 milliwatts.  He is running a 60 GHz transmitter, which is in the millimeter-wave range, at an unknown power.

https://www.youtube.com/watch?v=9T98VsMe3oo


My mother and step-dad couldn't get internet to work 3 miles outside of North Vernon, Indiana, using Verizon Wireless as the provider.  Fortunately, they now have fiber internet, although at a low speed of 4 to 5 Mbps.

Monday, August 30, 2021

Now Games Can Look Like Pixar Movies - Unreal Engine 5

The bottom line is that the Unreal Engine allows massively detailed images to be generated in real time, either for games or, as on The Mandalorian TV series, for a dome of screens surrounding the actors that generates the environment they act in.  (https://techcrunch.com/2020/02/20/how-the-mandalorian-and-ilm-invisibly-reinvented-film-and-tv-production/)

https://www.youtube.com/watch?v=47I2N_l47mw

The first version of the Unreal Engine was used to make a videogame in the late '90s called "Unreal".  I played this game.  By today's standards, it was very crude, but it was actually a step up from what we were used to at the time.

Monday, August 9, 2021

An honest conversation on Apple, hashing, & privacy with Daniel Smullen

https://youtu.be/9ZZ5erGSKgs

How is Apple examining the data on my phone any different from wiretapping?

It is a federal crime to wiretap or to use a machine to capture the communications of others without court approval, unless one of the parties has given their prior consent. It is likewise a federal crime to use or disclose any information acquired by illegal wiretapping or electronic eavesdropping.

Thursday, August 5, 2021

Apple is about to start scanning iPhone users' devices for banned content, warns professor • The Register

https://www.theregister.com/2021/08/05/apple_csam_scanning/?td=keepreading-btm

Apple claims to protect its customers' privacy.  Regardless of how good the cause is, I don't want them looking at my photographs.  It also shouldn't be their job to act as police.

Sunday, July 18, 2021

Green Screen special effects

I saw a shorter version of this on Facebook that starts 38 seconds into it.  I found it impressive.

https://youtu.be/FFJ_THGj72U?t=38