I came to this audiobook with high expectations. I have listened to audiobooks of three of Robert Harris’ earlier works (Pompeii, and the books about Cicero in ancient Rome, Imperium and Lustrum) and have been very impressed each time. The narrators have always been excellent and this book is no exception. The narrator here is David Rintoul, who did brilliant work on The Day Of The Jackal. His measured delivery and the fact that he is comfortable dealing with snippets of French benefit this story immensely.
As for the content itself, this is a very solid piece of work from Harris. It is compulsive listening, if not necessarily thrilling.
On a side note, the title of the book is also open to interpretation: are we talking about the two principal characters of the story – one an “officer”, the other a “spy”? Or is the narrator both? Or the alleged victim? Do both men qualify for each role?
On a second side note – come on Mr Harris, where is the third part of the Cicero trilogy?
Stephen Wolfram is giddy with excitement. A long post, but worth a read all the way through.
I’m not sure what is coming, but whatever it is, I am starting to formulate an answer for that inevitable and challenging moment when my daughters ask me why they should bother to learn mathematics.
In fact, with (potentially) big steps forward like this laying a foundation now, maths teaching is going to be revolutionised and the above question will no longer be relevant. After learning the basics, advances of this kind will allow people to see the beauty of mathematics.
The young will grow up with things like this. In the same way they are all “digital natives” now, they will be the first generation to be “computation native” too.
They will grow up appreciating mathematics in a real-world context much more than most people of my (or any previous) generation.
Then they can truly change the world.
• Stephen Wolfram: Something Very Big Is Coming: Our Most Important Technology Project Yet
…recently something amazing has happened. We’ve figured out how to take all these threads, and all the technology we’ve built, to create something at a whole different level. The power of what is emerging continues to surprise me. But already I think it’s clear that it’s going to be profoundly important in the technological world, and beyond.
At some level it’s a vast unified web of technology that builds on what we’ve created over the past quarter century. At some level it’s an intellectual structure that actualizes a new computational view of the world. And at some level it’s a practical system and framework that’s going to be a fount of incredibly useful new services and products.
It’s hard to foresee the ultimate consequences of what we’re doing. But the beginning is to provide a way to inject sophisticated computation and knowledge into everything—and to make it universally accessible to humans, programs and machines, in a way that lets all of them interact at a vastly richer and higher level than ever before.
A crucial building block of all this is what we’re calling the Wolfram Language.
We call it the Wolfram Language because it is a language. But it’s a new and different kind of language. It’s a general-purpose knowledge-based language. That covers all forms of computing, in a new way.
There are plenty of existing general-purpose computer languages. But their vision is very different—and in a sense much more modest—than the Wolfram Language.
And so in the Wolfram Language, built right into the language, are capabilities for laying out graphs or doing image processing or creating user interfaces or whatever. Inside there’s a giant web of algorithms—by far the largest ever assembled, and many invented by us. And there are then thousands of carefully designed functions set up to use these algorithms to perform operations as automatically as possible.
So in a sense inside the Wolfram Language we have a whole computable model of the world.
It can be an array of data. Or a piece of graphics. Or an algebraic formula. Or a network. Or a time series. Or a geographic location. Or a user interface. Or a document. Or a piece of code. All of these are just symbolic expressions which can be combined or manipulated in a very uniform way.
In most languages there’s a sharp distinction between programs, and data, and the output of programs. Not so in the Wolfram Language. It’s all completely fluid. Data becomes algorithmic. Algorithms become data. There’s no distinction needed between code and data. And everything becomes both intrinsically scriptable, and intrinsically interactive. And there’s both a new level of interoperability, and a new level of modularity.
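Wolfram’s point about there being no distinction between code and data has a rough (and far more limited) analogue in mainstream languages. As a hypothetical sketch of the idea only – not the Wolfram Language’s actual semantics – Python’s `ast` module lets a program treat an expression as an ordinary data structure, rewrite it, and then run the result:

```python
import ast

# Build an expression as a data structure (a syntax tree), not as text.
expr = ast.parse("x ** 2 + 1", mode="eval")

# Treat the code as data: walk the tree and rewrite it,
# replacing every occurrence of the name `x` with the constant 3.
class Substitute(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "x":
            return ast.copy_location(ast.Constant(value=3), node)
        return node

rewritten = Substitute().visit(expr)
ast.fix_missing_locations(rewritten)

# ...and turn the data back into runnable code.
result = eval(compile(rewritten, "<expr>", "eval"))
print(result)  # 3 ** 2 + 1 = 10
```

In Python this takes a dedicated module and some ceremony; the quoted passage is claiming that in the Wolfram Language this fluidity is the default for everything, not a special case.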
And this is not just a theoretical idea. Thanks to endless layers of software engineering that we’ve done over the years—and lots of automation—it’s absolutely practical, and spectacular. The Wolfram Language can immediately describe its own deployment. Whether it’s creating an instant API, or putting up an interactive web page, or creating a mobile app, or collecting data from a network of embedded programs.
And what’s more, it can do it transparently across desktop, cloud, mobile, enterprise and embedded systems.
It’s been quite an amazing thing seeing this all start to work. And being able to create tiny programs that deploy computation across different systems in ways one had never imagined before.
There’ll be the Wolfram Data Science Platform, that allows one to connect to all sorts of data sources, then use the kind of automation seen in Wolfram|Alpha Pro, then pick out and modify Wolfram Language programs to do data science—and then use CDF to set up reports to generate automatically, on a schedule, through an API, or whatever.
And with our Wolfram Embedded Computation Platform, we’ll have the Wolfram Language running on all sorts of embedded systems, communicating with devices, as well as with the cloud and so on.
I’m very excited about all the things that are becoming possible. As the Wolfram Language gets deployed in all these different places, we’re increasingly going to be able to have a uniform symbolic representation for everything. Computation. Knowledge. Content. Interfaces. Infrastructure.
Just as the lines between data, content and code blur, so too will the lines between programming and mere input. Everything will become instantly programmable—by a very wide range of people, either by using the Wolfram Language directly, or by using free-form natural language.
After my last Open University course (Analysing Data) ended in June, I gave myself the summer off. I read a number of books and enjoyed the time that was previously taken by study with my family. I decided that the time demands of another OU course (along with work commitments) would be too much at the moment.
But that does not mean that I can’t carry on towards my goal of moving into the field of data science – I just need to approach it at my own pace.
There are a few new books that are starting to get to grips with what is a new and largely undefined subject, especially “Big Data”, “Doing Data Science” and “Learning R”. I feel that people can now become a lot more informed about what is involved. There is now more flesh on the bones. From here on in I’ll know what is actually involved and what is needed to get there. The more I do and the more I find out, the more challenging it seems. Daunting even. But I firmly believe this is a foundation for the future and I fully intend to take part, even if there is a long way to go.
So, what’s next? I intend to do more practical work; these things can be done concurrently.
• Continue to learn Python – I’m getting to grips with the basics
• Read “Doing Data Science” – I’ve started, and I think it will be a real education on what is really involved
That should keep me busy for a while…
99% Invisible is one of my favourite podcasts. It is currently running a Kickstarter campaign to fund a new season of weekly shows. It has raised the original amount it needed but now needs to hit 10,000 backers to get an additional $20,000 from a sponsor.
This is the first Kickstarter project I’ve backed, and I look forward to all the great episodes coming my way. You can donate from $1 – it will be the best 62p you’ve ever spent (that wouldn’t even buy you an iPhone app…).
Please back the project.
The bottom line first: if you are interested in a broad, non-technical introduction to the subject of “Big Data” then you should read this book. It is short and highlights a number of points (some of which aren’t necessarily clear from reading elsewhere).
Importantly, in the first chapter it says that to be running “big data” projects you do not have to be dealing with millions of data points. There may be far fewer; the point is that you should be working with all the data that is available to you rather than just a sample. With all the data, it is possible to analyze it in different ways. With just a sample, you will likely be limited to answering the questions that were decided before the sample was taken. The authors discuss the very first article I read about this subject, Wired’s The End of Theory. It’s very interesting to read how the article is now regarded.
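The sample-versus-everything point can be made concrete with a toy sketch (hypothetical data and my own illustration, not an example from the book): if you keep every record, you can group and summarise by attributes you never planned to ask about when the data was collected.

```python
from collections import defaultdict

# Toy dataset (hypothetical): every transaction, not a sample.
transactions = [
    {"region": "north", "channel": "web",   "amount": 20},
    {"region": "north", "channel": "store", "amount": 35},
    {"region": "south", "channel": "web",   "amount": 50},
    {"region": "south", "channel": "store", "amount": 15},
    {"region": "south", "channel": "web",   "amount": 40},
]

def mean_by(records, key):
    """Group records by any attribute and average the amounts.
    Because all the data was kept, the grouping key can be chosen
    long after collection -- a question nobody planned for."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["amount"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

print(mean_by(transactions, "region"))   # averages by region
print(mean_by(transactions, "channel"))  # averages by channel
```

A sample drawn to answer the “region” question might not support the “channel” breakdown at all; the full dataset supports both, and any question dreamt up later.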
People may have to get used to the data revealing what is happening without actually revealing why it is happening. In some areas we will have to let go somewhat of the (natural) desire to understand the reasons behind the results.
The authors deal with the subject of data getting “messier” (becoming more imprecise) as you increase the amount you are collecting:
However in many new situations that are cropping up today allowing for imprecision – for messiness – may be a positive feature not a shortcoming. It is a tradeoff. In return for relaxing the standards of allowable errors, one can get a hold of much more data. It isn’t just that “more trumps some” but that, in fact, sometimes “more trumps better”.
Because this data set consists of more data points, it offers far greater value that likely offsets its messiness.
Big Data transforms figures into something more probabilistic than precise.
So more trumps less. And sometimes more trumps smarter.
“Simple models and a lot of data trump more elaborate models based on less data.” (quote from Peter Norvig, Google)
… treating data as something imperfect and imprecise lets us make superior forecasts and thus understand our world better
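The “more trumps better” claim is essentially a statistical one: a large number of noisy measurements can average out to a better estimate than a handful of precise ones. A toy Python simulation (my own illustration, not the authors’) shows the idea:

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0

# A few precise measurements (small error)...
precise = [TRUE_VALUE + random.gauss(0, 1) for _ in range(10)]

# ...versus a flood of messy ones: five times the noise, 10,000x the volume.
messy = [TRUE_VALUE + random.gauss(0, 5) for _ in range(100_000)]

err_precise = abs(statistics.mean(precise) - TRUE_VALUE)
err_messy = abs(statistics.mean(messy) - TRUE_VALUE)

print(f"error from 10 precise readings:    {err_precise:.3f}")
print(f"error from 100,000 messy readings: {err_messy:.3f}")
```

With these settings the mean of the messy readings typically lands within a few hundredths of the true value, while the ten precise readings will usually miss by several times that – relaxing the standard of each individual measurement bought enough extra data to win overall.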
The chapter on the “Datafication” of just about everything is a good balance of history and the insights that can be gleaned from today’s social media giants. Location is particularly important:
The point is that these indirect uses of location data have nothing to do with the routine of mobile communications, the purpose for which the information was initially generated. Rather, once location is datafied new uses crop up and new value can be created.
Datafication is only just starting, but now it is under way it will continue, with many benefits:
Once the world has been datafied, the potential uses of the information are basically limited only by one’s ingenuity.
Seeing the world as information, as oceans of data that can be explored at ever greater breadth and depth offers us a perspective on reality that we did not have before.
Another important point is that humans will have to get used to the fact that their opinion is not always the best:
… the biggest impact of big data will be that data-driven decisions are poised to augment or overrule human judgement.
This is likely to mean a change in the requirements needed to do a specific job. The importance of experience will diminish as insight from data can dwarf the experience of one person.
Mathematics and statistics, perhaps with a sprinkle of programming and network science, will be as foundational to the modern workplace as numeracy was a century ago and literacy before that.
… the winners will be found among large and small firms, squeezing out the mass in the middle.
Big data squeezes the middle of an industry, pushing firms to be very large, or small and quick, or dead.
Re-use of data is looked at – old data can be combined with new in different ways to discover or exploit new opportunities. So what is the value of data? A company may have relatively few assets but a massive company valuation – therefore is the difference between the two the value of the data the company controls? That could mean billions of pounds / dollars / etc.
A number of times there were names of sites or companies that led me to put the book down, check out a website or install an app. The chapter called “Implications” is particularly good for that, but it does slow down the reading somewhat. Even in a book this recent, some of the examples are already out of date (for example, Decide.com shutting its doors as its staff join eBay). This is a fast-moving field.
There is a lot more to this book, impressive given that it is only 200 pages long. I’m glad I read this book – it puts so much into focus.
The latest Die Hard film (AKA “Let’s blow the shit out of Moscow”) is not good. Here is a mini review of the original that appeared on my second website over a decade ago:
As action films go, this still has not been surpassed.
The story is a basic one, and one you probably know – a group of terrorists break up a Christmas party so that they can steal over $640m of negotiable bonds. John McClane (Bruce Willis), a New York cop, takes them on from the inside.
So, what makes this film so good? The script is a belter – dark humour bristles throughout. The performances are first class – Willis is perfect for the role, Alan Rickman rules supreme. Rickman’s head terrorist is fiercely intelligent, informed, vicious, heartless. What else could you want?
“Glass? Who gives a shit about glass?!”
For me, the biggest difference though is that McClane is not a superhero. Bullets do not bounce off him, or land an inch from his feet. He suffers. By the end he looks like death warmed up. Making him barefoot is just a little detail in the script, but it adds so much to the film.
“Five million terrorists in the world and I have to kill one with feet smaller than my sister”
This film also has a lot to answer for – even over a decade later, films are still using it as a reference. For a while every action flick was “Die Hard on a ….” boat, train, plane, whatever. Die Hard 2 takes some justifying, too.
“I was always kind of partial to Roy Rogers, actually”
Also, this is the film that started the spiral in actors’ pay to the point that A-list stars now command $20m a throw. Willis was paid $3m – a big wage in those days; he was just a TV actor trying to break into film. His first film had flopped, so where was the justification for his pay-day? And off it went…
So you think you’ve seen it? Try it again – this is the best actioner around.