Category: IBM

Mini review: “A Logic Named Joe” by Murray Leinster (from “Machines That Think”, a 1984 science fiction short story anthology)


For a while now I have been reading more about the history of computing (in the USA, and Silicon Valley in particular).  It started with the sublime article “The Tinkerings of Robert Noyce” by Tom Wolfe in Esquire magazine, followed by the revelatory – to me at least – Rolling Stone article “SPACEWAR” by Stewart Brand.  Next up was the wonderful book “Troublemakers”, which covers Silicon Valley from 1969 to 1984.

The reason I’m interested in this is that the more I find out about the history of computing, the more I realise that the world we live in today was conceived several decades ago.  Ideas that we think of as modern originated back then.

What they predicted back then, we are enmeshed in today.

The story “A Logic Named Joe” features in “Machines That Think”, a science fiction short story anthology from 1984.

The story is just 17 pages long but I was astounded.  This story from 72 years ago appears to predict the internet, artificial intelligence and some of the less salubrious social consequences of having the world’s information at our fingertips.

The introduction talks about the importance of the story due to the way it predicted widespread ownership of computers, made possible by the reduction in size and cost of the machines.  In fact, the world the story was describing had not yet arrived in 1984 – it was too early to comment on the story and truly understand how predictive it would become.

You can read the story in full and listen to the excellent radio adaptation – both highly recommended.

IBM 5 in 5 predictions (2018): blockchain, cryptography, AI and quantum computing

The “5 in 5” is IBM’s annual prediction of five things that will change our lives in the next five years.

If these are correct, the future will be here sooner than you might think…

• IBM: 5 in 5 – Five innovations that will help change our lives within five years

• IBM: Changing the Way the World Works: IBM Research’s “5 in 5”

Mini review: “Turing’s Cathedral: The Origins of the Digital Universe” by George Dyson (audiobook version)


This is the story of the early development of the computer in the United States, one that is inextricably linked to the creation of the atomic and hydrogen bombs. Despite Turing’s name appearing in the title, he plays only a small role.  Rather, this book concerns John von Neumann and the Institute for Advanced Study – how a group of mathematicians and engineers took Turing’s idea of a universal machine and built one of the earliest computers.

I am conflicted about this book.  It is a definitive telling of the story of what happened on the other side of the Atlantic and I do recommend it to anyone interested in the subject.  On the other hand, it can be rather dry and contains some largely unnecessary information.  I’m all for details to make the history come alive, but some passages take you into quite long diversions from the tale being told.  This could put off some readers early on, but I’d advise you to stick with it.  Also, towards the end of the book the author tries to link the early developments with the internet and technology companies of today but he doesn’t do a particularly good job.  To me it seemed redundant.

In relation to the audiobook version, the narrator does a professional job – another default American male voice.  His delivery can be a little monotonous, but it is consistent.  He copes well with some challenging names of people and places.  There are a lot of characters in this book, and he wisely does not try to give each person their own voice.

Overall I would recommend this book.  It is not perfect but in general it is a good story well told.  This is one of the rare books where I would recommend you go for the paper version – there can be a lot to digest at points and it would be easier to follow.  At some point I will pick up a copy myself, not to fully re-read but to be able to refer back to.


• The Wall Street Journal: The Nucleus of the Digital Age

• The Guardian:  Turing’s Cathedral by George Dyson – review

IBM’s 5 in 5 for 2013: Five innovations that will change our lives in the next five years

Areas covered this year…

• Personalised learning

• Buying locally (mixing online intelligence with brick-and-mortar stores)

• Personalised cancer treatment

• Digital security

• Future cities

• IBM: The 5 in 5

Article: Microsoft’s Lost Decade (Vanity Fair)

A damning piece about Steve Ballmer’s time as CEO of Microsoft… especially the poisonous environment it has produced within the company.

… more than a decade littered with errors, missed opportunities, and the devolution of one of the industry’s innovators into a “me too” purveyor of other companies’ consumer products.

Microsoft became a high-tech equivalent of a Detroit car-maker, bringing flashier models of the same old thing off of the assembly line even as its competitors upended the world.

How could a company that stands among the most cash-rich in the world, the onetime icon of cool that broke IBM’s iron grip on the computer industry, have stumbled so badly in a race it was winning?

… a mastery of internal politics emerged as key to career success.

…. because of a series of astonishingly foolish management decisions …

They used to point the finger at IBM and laugh… Now they’ve become the thing they despised.

… the Microsoft of old, the nimble player that captured the passions of a generation of techies and software engineers, is dead and gone.

Microsoft failed repeatedly to jump on emerging technologies because of the company’s fealty to Windows

With the competitors [Apple, Google] showing that kind of success – and winning so many accolades – Ballmer’s confidently proclaimed errors have been hugely embarrassing for Microsoft’s technical specialists, fuelling muttered complaints that their C.E.O., a man with little technological background, was undermining them within the techie community.

When he makes these predictions that are so horribly wrong…. it is hard to forgive that, because it means that he is hopelessly out of touch with reality or not listening to the tech staff around him.

Already there are rumblings that the time for him to go could be in the offing.

• Vanity Fair:  Microsoft’s Lost Decade

Also:

• The Critical Path podcast discusses the mixed messages from Microsoft about their Surface tablet – excellent analysis in the second half:  Management vs. Leadership

• The Guardian: Microsoft reports first public loss after charge for aQuantive writedown (good and bad news for the company – huge writedown, but record revenues)…

Just how amazing is data generation and analysis of one exabyte a day?


Asking the question posed in the title of this post – just how amazing is data generation and analysis of one exabyte a day? – seems somewhat premature when you consider that it is not yet even possible to do this efficiently.

The question came up in my mind when I read the following article and associated press release:

• Future telescope array drives development of exabyte processing (Ars Technica)

• IBM’s press release (direct link to PDF)

Here are some quotes:

…the proposed Square Kilometer Array (SKA) of radio telescopes promises to push well beyond the current computing ability of the entire planet

The initial investment… will be used to investigate the types of new processors, power supplies, storage systems, and networking technology necessary to handle the amount of data needed.

…the challenge is not merely to collect and store the data, but to process at least some of it in real time.

While handling data on this scale is possible with current technology simply by brute force, the energy requirement is prohibitive. Thus, the DOME collaboration will investigate new processors, optical networking techniques, and fast storage methods that are energy efficient.

From the press release:

Scientists estimate that the processing power required to operate the telescope will be equal to several millions of today’s fastest computers.

The next generation of large scientific instruments, of which the SKA is a key example, requires a high-performance computing architecture and data transfer links with a capacity that far exceeds current state-of-the-art technology.

This is Big Data Analytics to the extreme.

Only by basing the overall design on architectures that are beyond the current state-of-the-art will it be possible to handle the vast amounts of data produced by the millions of antenna systems of the SKA.
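To put “one exabyte a day” in concrete terms, here is a rough back-of-the-envelope calculation (assuming the SI definition of an exabyte, 10^18 bytes – the figures would be slightly higher using the binary exbibyte):

```python
# Back-of-the-envelope: one exabyte per day expressed as a sustained data rate.
EXABYTE = 10**18          # bytes (SI definition)
SECONDS_PER_DAY = 86_400

bytes_per_second = EXABYTE / SECONDS_PER_DAY
terabytes_per_second = bytes_per_second / 10**12       # decimal terabytes
terabits_per_second = bytes_per_second * 8 / 10**12    # network bandwidth view

print(f"{terabytes_per_second:.1f} TB/s sustained")    # ~11.6 TB/s
print(f"{terabits_per_second:.0f} Tb/s of bandwidth")  # ~93 Tb/s
```

In other words, the SKA would need to ingest roughly 11–12 terabytes every second, around the clock – which makes clear why the article talks about energy budgets and processing architectures rather than raw feasibility.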

This is a massive undertaking that will benefit all of computing.  However, there is one sentence in the article that changed my impression of the project:

The projected date for the start of full operations is 2024.

This prompted me to ask another question: Just how amazing will data generation and analysis of one exabyte a day be in 12 years’ time?

The only credible answer I can give at the moment: much less remarkable than it is today.