Saturday, June 22, 2019

iMac

When I purchased a top-of-the-line iMac in March 2010, I paid quite a lot of money for it, but my rationale was that it should last a long time, ideally until the end of the decade.

I tend to be a heavy user who runs processor-intensive tasks. The old advice was that computers were meant to be left on all the time, so I left mine on all the time. That was probably bad advice, and the chickens have come home to roost. I recently paid to have the iMac cleaned out because it was full of dust and badly overheating. (Dust buildup is a particular problem with all-in-one computers, which tend to run hot anyway; laptops can have the same issue.) Even so, I now have four major components either failing or about to fail, and the high cost of repair makes it look like it is time to get a different computer.


--

Monday, June 10, 2019

iMacs and all-in-one computers

The problem with all-in-one computers is that if anything goes wrong, you have to service the entire computer.

My current problem is that the display shuts off. If I had a standard desktop tower and the monitor were failing, I would just buy a new monitor.

My previous problem was that the computer was overheating because it was clogged with nine years of dust. You are less likely to have this problem on a regular tower computer, and even if you did, it is far easier to get to the internals to clean them out.

--

Thursday, May 23, 2019

Incredible Retrobrighting Discovery

Old computers and videogames use a cheap type of plastic that yellows over time. This is a problem for collectors. I have seen videos where people go to very elaborate lengths to disassemble the devices and soak the plastics in hydrogen peroxide along with other chemicals, which is expensive.

However, this video claims that there is a much easier method.

https://www.youtube.com/watch?v=8P1OVj0IcqY

Saturday, May 11, 2019

Intel processors

I have learned that Intel microprocessors made over roughly the last decade include another processor, the Intel Management Engine, which runs its own operating system (recent versions run Minix). This is quite surprising. The extra processor is responsible for managing the main processor, which seems very odd to me, and it has also been the source of security problems that Intel had to patch.
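
Out of curiosity, here is a rough sketch of how you might look for this hidden processor from Linux. This is just my own illustration, not anything from Intel: it assumes a Linux system with lspci installed and the standard mei_me driver, and it only checks whether the operating system can see the Management Engine interface.

# Rough sketch (my own illustration): look for signs of the Intel
# Management Engine from Linux. Assumes lspci is installed and the
# mei_me kernel driver is loaded; adjust for your own system.
import glob
import subprocess

def mei_device_nodes():
    # The mei_me driver exposes the Management Engine as /dev/mei*.
    return glob.glob("/dev/mei*")

def mei_pci_entries():
    # Search lspci output for the Management Engine / HECI device.
    output = subprocess.run(["lspci"], capture_output=True, text=True).stdout
    keywords = ("Management Engine", "MEI", "HECI")
    return [line for line in output.splitlines()
            if any(word in line for word in keywords)]

if __name__ == "__main__":
    print("Device nodes:", mei_device_nodes() or "none found")
    for entry in mei_pci_entries():
        print("PCI device:", entry)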

--

Tuesday, April 30, 2019

Ryzen Threadripper - AMD - WikiChip

My 2009 iMac has a first-generation Core i7 chip built on a 45-nanometer process. I just bought a refurbished laptop with a Core i5 built on a 32-nanometer process, and it performs 80% as well as the desktop even though it has half as many cores.
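
Here is the back-of-the-envelope math behind that comparison. The 80% figure is my own rough impression rather than a benchmark, and the core counts are assumed from a quad-core i7 versus a dual-core i5:

# Back-of-the-envelope comparison using the numbers above.
desktop_cores = 4        # first-generation Core i7 (quad core)
laptop_cores = 2         # Core i5 laptop (dual core)
relative_total = 0.80    # laptop's overall performance relative to the desktop

# If the laptop reaches 80% of the desktop's performance with half the
# cores, each laptop core is doing roughly 1.6x the work of a desktop core.
per_core_ratio = (relative_total / laptop_cores) / (1.0 / desktop_cores)
print(f"Each laptop core is roughly {per_core_ratio:.1f}x a desktop core")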

This is an interesting website about microprocessors. I don't think that I would want to buy a processor built on a 10- or 14-nanometer process when 7-nanometer processors are coming from AMD in August. I find it interesting that Apple introduced 7-nanometer chips in their phones many months before either major processor company came out with 7-nanometer processors. I also find it interesting that both the new PlayStation and the new Xbox will have 7-nanometer processors from AMD.

In terms of the laws of physics, it is almost impossible to get much smaller than this, although I have heard talk of a 5-nanometer chip. I have also heard about a possible 1-nanometer chip using different materials, but that technology is a long way off, and I have my doubts about how well it could work.

https://en.wikichip.org/wiki/amd/ryzen_threadripper

Sunday, March 3, 2019

The future of humanity

I see a danger to the future existence of the human race, and it is the kind of thing that people should think about and prepare for now. Sometime in the next 50 years, machines will be smarter than people. There are major technical hurdles to overcome, such as the inevitable end of Moore's Law, which probably means that it is not right around the corner or even within the next couple of decades, but it will happen, and easily within this century. And if for some reason it does happen within the next couple of decades, then the results will be upon us that much sooner.

We can predict what will happen next and follow it to its logical conclusion, which is a future without people.

As machines become smarter, people will become increasingly reliant on technology. We can see that already with smartphones, which have been with us for barely over a decade. Eventually machines will do all the heavy mental work, which will make our lives easier but also make us more dependent.

And since we will be so dependent on the machines, we will start incorporating them into us. This will evolve over time until we are no longer purely human, but human-machine hybrids. Perhaps when your biological brain dies, the machine part of you will be able to continue with all your memories intact. Maybe it would have an artificial body, or maybe it would exist in a virtual world. It is likely that some would prefer to live in a virtual world where they can do more things than they could in the real world. Taken to the eventual extreme, our descendants would no longer bother with biological bodies and would prefer to exist as machine intelligences, either in the real world or in virtual ones.

The evolutionary pressure will be against purely biological people. Having machines incorporated into you will make you more productive and more competitive, and it will increase your quality of life.

The future I describe might be long distant, but if it is not the future we want for the human race, then we should start thinking about it now. Maybe we could have a Pure Human movement that would prohibit the merging of machine intelligence with human intelligence. This could be roughly analogous to the current legal bans on human cloning: we very likely have the technology right now to clone humans, but countries ban it because they are uneasy about the implications of where that might take us.

However, we might not be able to prevent it. Linking machines with human intelligence is likely to happen in such small steps that we will easily adjust to it. It is sort of happening already with our dependence on computers. It could also start as a series of military applications where having the most effective soldiers determines who wins the wars. And once the genie is out of the bottle, we will never get it back in.

Best wishes,

John Coffey

Friday, February 22, 2019

Fwd: The Verge: Apple is reportedly closing two stores in a Texas district to avoid patent trolls

Apple is reportedly closing two stores in a Texas district to avoid patent trolls

The Verge

The Eastern District of Texas is notorious for patent cases, and Apple wants out.




Saturday, January 26, 2019

AMD APUs

I have some interest in getting an APU at some future date with the idea of building a lower-cost computer. An APU is a chip that combines the processor and the graphics "card" on a single chip, which can save quite a bit of money over buying a separate graphics card. For the moment, if you want the best performance, you need a separate graphics card, because you can get both better processors and better graphics cards that way.

The AMD APU that comes with the Xbox One X is fairly impressive. The processor is not fantastic, but the graphics are really good. However, you can't buy this APU and put it in a computer. The APUs that AMD released for computers in 2018 were just for low-end gaming, but I have read that AMD is going to release new APUs in 2019 and 2020.

Both Microsoft and Sony are planning on releasing next-generation consoles either this year or next. Reportedly, both will have higher-performance AMD APUs, though not necessarily the same processor. I don't like the idea of buying a console that has PC hardware you can't use as a computer. It would seem better to have the same performance in a computer, but so far AMD has not been releasing those chips to the general public.

This video gives an interesting history about how the technology has been evolving:

--