Tuesday, November 23, 2021

Transistor count

Apple tops the transistor count for microprocessors.

https://en.wikipedia.org/wiki/Transistor_count

Object-Oriented Programming is Bad

There aren't many people that I can talk to about computer programming.

https://www.youtube.com/watch?v=QM1iUe6IofM

In the 1970s, I learned that you write functions to avoid duplicating code.  The example most often given is a square root function.  You only need to write it once and call it from multiple places.
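
To illustrate the idea, here is a rough C++ sketch of my own (the details are made up, and real code would just call the standard library's sqrt): the routine is written once and then called from multiple places instead of duplicating the math.

#include <cstdio>

// Written once: a simple Newton's method square root approximation.
double mySqrt(double x)
{
    double guess = (x > 1.0) ? x / 2.0 : 1.0;
    for (int i = 0; i < 20; i++)
        guess = (guess + x / guess) / 2.0;
    return guess;
}

int main(void)
{
    // Called from multiple places instead of writing the math twice.
    std::printf("sqrt(2)  = %f\n", mySqrt(2.0));
    std::printf("sqrt(10) = %f\n", mySqrt(10.0));
    return 0;
}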

In the 1980s, I learned in school that you should break long sections of code into smaller, easier-to-understand pieces by calling functions.  For example...

initializeGame()
playGame()
terminateGame()

This can make the code somewhat self-documenting.  I became a big fan of this style of programming, even while writing in assembly language, which is what I mostly used in the videogame industry.  In the late 1990s, one of my coworkers accused me of writing "spaghetti code" by doing this, although I still like this style.
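
To make that style concrete, here is a small C++ sketch of my own; the function names match the outline above, and the bodies are just made-up placeholders.

#include <cstdio>

// Each function hides the details of one phase of the program.
void initializeGame() { std::printf("Setting up the board...\n"); }
void playGame()       { std::printf("Playing until someone wins...\n"); }
void terminateGame()  { std::printf("Cleaning up and saving the score...\n"); }

int main(void)
{
    // The top level reads like an outline of the program.
    initializeGame();
    playGame();
    terminateGame();
    return 0;
}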

I didn't learn about Object-Oriented Programming until the 1990s.  I had to use it with Visual C++, but I didn't do much with it, and I didn't feel comfortable with it.  Since then I have grown somewhat accustomed to it, but I have never reached 100% comfort with it.  Initially, I believed that Object-Oriented Programming was only useful for Graphical User Interfaces, which is what it was primarily recommended for.

Reportedly, Microsoft was pushing Object-Oriented Programming in the 1990s.

I have found that debugging object-oriented code can be a nightmare, especially when dealing with inheritance.  In that case, it also feels like "spaghetti code."

I don't look at Objects as a style of programming, but as data structures that are occasionally useful.  If the code is very tightly bound to a specific set of data then putting it in an Object helps organize the code.  If you have multiple independent instances of a data structure, then the code is (possibly) cleaner if you put it in an Object. 
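
As a small made-up C++ example of what I mean, the data and the code that operates on it live together, and each instance keeps its own independent state.

#include <cstdio>

// A tiny class whose code is tightly bound to its own data.
class Score
{
public:
    void add(int points) { total += points; }
    int  value() const   { return total; }
private:
    int total = 0;
};

int main(void)
{
    // Two independent instances of the same data structure.
    Score playerOne;
    Score playerTwo;

    playerOne.add(100);
    playerTwo.add(250);

    std::printf("Player 1: %d, Player 2: %d\n", playerOne.value(), playerTwo.value());
    return 0;
}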

Not that I am a big fan of OOP.  In most cases, I don't find a compelling reason to use Objects.  I am glad to see a video that favors "Procedural Code".

It seems like people in the computer industry have been trying to deal with the issue of complexity for a long time.  Back in the 1980s, people were pushing "Structured Programming."  Today, I don't even know what that is, but at the time I found the buzzwords enticing.

Best wishes,

John Coffey

Saturday, November 20, 2021

The Amazon Fire HD 10 is 50% off right now in Black Friday tablet deal

I'm going to start off by saying that I don't know why anybody needs a tablet.  Compared to smartphones with large 5.5- to 6.5-inch screens, tablets are bulkier and more difficult to take everywhere.  However, Amazon has $75 off their 2021 10.1-inch tablets, which is a heck of a nice deal on already budget-priced tablets.  Last year during Black Friday I purchased the 2019 model for $80, and it is fine as a tablet.  My only complaint, besides not really needing a tablet, is that iPads, costing hundreds of dollars more, feel nicer to hold in the hands.

If you want more processing power and slightly more RAM, and you probably should because it will provide a better overall experience, then spend the extra $30 to get the "Plus" model, which is also $75 off.

Reviewers have complained about being limited to the Amazon ecosystem, with the Amazon app store and Amazon software.  However, there are fairly easy ways to get around this, and there are videos on YouTube showing how to install the Google Play Store or how to turn the device into a regular Android tablet.

You might get much more value out of the tablet if you have an Amazon Prime membership.

https://www.tomsguide.com/deals/the-amazon-fire-hd-10-is-50-off-right-now-in-black-friday-tablet-deal

https://www.tomsguide.com/reviews/amazon-fire-hd-10-2021