Author: Charles

Remember how Britain took over the internet in 2000 by getting it all to run on Greenwich Electronic Time? No?

Something on Twitter reminded me of this. This was written for the January 27 2000 edition of The Independent.

BY CHARLES ARTHUR
Technology Editor

Britain’s Greenwich meridian could become the new reference point for time over the Internet, after two rival groups of British businesses resolved their differences over whose measurement they should use.
Greenwich Electronic Time (GeT) will be a powerful brand which could guarantee that companies based in different countries doing business deals could be certain of when they happened.
With more and more time-sensitive data being exchanged – such as online stockbroking and consumer purchases – it is increasingly important to be able to confirm when transactions take place, said James Roper, chief executive of the Interactive Media in Retail Group.
“Who owns a product at what time if you buy it over the Internet?” said Mr Roper. “If you don’t agree about what time it is, you could find that there is a time during which people think they own it – and if both of them then try to sell it you could have real problems.”
By using GeT as a single reference time, confirmed by a network of super-precise clocks around the Internet, Britain would be “at the forefront of Internet development,” said the Government’s newly appointed “e-envoy” Alex Allan, the former British High Commissioner to Australia.
Comparing timestamps of online transactions has already helped to track down fraudsters, said Ian Collins, managing director of Cybersource, which provides the software that powers many e-commerce Web sites. Extending GeT further would help to do that in future, he said.
Yesterday’s launch saw the unification of two factions that had threatened to split the initiative before it started.
The Prime Minister Tony Blair initially launched GeT on January 1 – but it did not then have the essential backing of the London Internet Exchange (Linx), which represents the major Internet service providers in the UK.
Linx, whose offices lie on the Greenwich meridian, had planned to launch its own Greenwich Net Time earlier this month – but was persuaded not to by lobbying from the Government and other industry bodies. Instead the two merged their efforts to produce the single brand.
The Internet already has a network of clocks which are meant to contact each other and confirm their time by connecting to other precision clocks, usually running on “Coordinated Universal Time”, a global standard adopted in 1982.
A key step in promoting the GeT “brand” will be the availability of free software from its Web site at www.get-time.org which will enable businesses and users to ensure that their computers are in tune with GeT, and to timestamp e-mails and Web transactions against them. That software should be available in the next three or four months, said Mr Roper.
//ends

—-

Great idea! (Well, inside the civil service it seemed great. I thought it was a pile of nonsense. After all, you already had UTC, coordinated via atomic clocks over the net.) What could possibly go wrong?
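For the record, the "network of clocks" that already existed is the Network Time Protocol (NTP). Its core trick is estimating a client's clock error from four timestamps exchanged with a server; a minimal sketch in Python, with invented sample timestamps for illustration:

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """Standard NTP clock-offset estimate from four timestamps:
    t0: client transmit time (client clock)
    t1: server receive time (server clock)
    t2: server transmit time (server clock)
    t3: client receive time (client clock)
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0  # estimated client clock error
    delay = (t3 - t0) - (t2 - t1)           # round-trip network delay
    return offset, delay

# Hypothetical timestamps in seconds: this client's clock runs 9.5s slow,
# and the round trip over the network took 1 second.
offset, delay = ntp_offset_and_delay(100.0, 110.0, 110.5, 101.5)
print(offset, delay)  # 9.5 1.0
```

Which is why "GeT" never had anything to add: any machine speaking NTP could already discipline its clock against UTC this way.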

And then in August:

—–

BY CHARLES ARTHUR
Technology Editor

A high-profile scheme launched by Tony Blair in January to make Greenwich the reference point for “Internet time” has run into a dead end. It cannot work with Microsoft’s Web browser, used by the vast majority of Net surfers.
Now, the team behind the “Greenwich Electronic Time” (GeT) initiative are wondering if they will ever be able to persuade people to use their product.
“Overhyped? Er, that would be true and fair I suppose,” said James Roper, chief executive of the Interactive Media in Retail Group (IMRG), one of the scheme’s backers. “We have encountered a nightmare of problems that were so compounded we hardly knew where to start.”
Announcing the plan to create “Greenwich Electronic Time” (GeT), at the start of the year, Mr Blair suggested it would put Britain back at the centre of timekeeping in the new millennium just as the invention of Greenwich Mean Time (GMT) did during the age of sail.
But the reality has proved rather different. The GeT team had suggested in January that within four months they would offer free software for PCs which would be accurate to 0.003 seconds against an existing world standard set by atomic clocks.
Instead, the project only last week produced the first version of its software – and The Independent found that it can display times on the same screen which are out of sync with each other by nine seconds or more.
The problem stems from Microsoft’s Internet Explorer browser, used by more than 80 per cent of Web surfers. Computer code within the program behaves unpredictably, creating the differing time display. But the software giant shows no signs of changing its product to please Mr Blair or the GeT team.
“You would have to ask Microsoft why their version of their own software doesn’t do what their published details say it will,” said Keith Mitchell, executive chairman of the London Internet Exchange (Linx), who is exasperated by the mismatch. “I don’t know why it doesn’t.”
The failure is another embarrassment for the Government’s repeatedly proclaimed desire to make Britain an e-commerce capital. Last week the House of Lords passed the Regulation of Investigatory Powers (RIP) Bill, which has been criticised by business and consumer groups for infringing on civil liberties. A number of Internet companies have said they will relocate outside Britain to avoid the email and communications snooping that the RIP Bill allows.
The flaw in GeT is caused by differences between Microsoft’s version of a computer language called “Java” and the public standard created by Sun Microsystems. Microsoft is being sued by Sun for breaking its licence to use Java in the browser. No resolution is in sight.
The GeT team had hoped that their system – backed by a network of atomic clocks around the Internet – would rapidly become a reference point for all sorts of online transactions. Mr Roper of the IMRG, which backs the scheme, suggested last week that it could be used to help people doing online share dealing, gambling and auctions: these, he said, could hinge on messages which would have to be time-stamped to an accuracy of less than a second from a central reference point. The Government’s “e-envoy” Alex Allan said it would put Britain “at the forefront of Internet development”.
Instead, despite the non-appearance of GeT, electronic commerce has snowballed this year. Online gambling, sharedealing and auctions are all booming, used by millions of people worldwide.
“The world is muddling through,” insisted Mr Mitchell, “but the volume of transactions compared to their potential is still small.”
The same applies to GeT, though: its present network of atomic clocks could handle “tens of thousands” of users, said Linx. That compared with projects like Napster, which has an estimated 20 million people using its software.
The GeT project, meanwhile, was reluctant to publicise the release of the first version of its software in case too many people try to use it: there are fears that the atomic clocks would be unable to cope with a large volume of demands for the time.

—-

Oh God, you have to believe that I was just astonished at how bad that was. And how fundamental the mistakes were.

Still, we don’t have that sort of idiocy any more in the civil service or government. Do we?

Live, PR, live in the 21st century

So there’s lots of people reading my post about the evils of PR done badly.

But who ever suggests how to do it correctly?

Well, here’s a start.

Emails: have a meaningful subject line. Often it’s the only thing the journalist will read before deleting it. Journalists delete lots of emails. Never, ever leave it blank.

DO include the content of what your client insists should be attachments in the body of the email. More and more journalists are reading their emails on the move, so they can’t necessarily view attachments, and won’t set their phones to download them. Text is cheap. Put it in the body of the email. And then tell the client you don’t need to include the 1MB attachment because it’s been dealt with in 50K of text in the body. (All that’s left out is the vast logo nobody cares about.)

DON’T send PDFs as attachments. Can’t get the text out cleanly, can’t read them easily.

DON’T include pictures unless they’re the very smallest thumbnails, for the reason just given above: mobile data is an expensive pain.

DO include a link where we can get the entire press release and/or the images for it. We might want to link to it so readers can gasp at your brilliance. Plus it means we don’t need to copy or retype stuff. If it’s embargoed, give a username and password to log in so we can look at it. But set the password to expire when the embargo lifts, so everyone can see it in time.

DO, if you’re going to inflict a survey on people (mostly: please don’t) include a link to the original data where the journalist can download it and play about with it. Normal humans might like to do the same.

DO understand that journalists get gazillions of emails every day, plus we’re looking around at blogs, plus we have stuff to do ourselves. We don’t necessarily have time to respond to every one. In fact, we definitely don’t. (See above about deletion.) That followup phone call just gets in the way of us writing a story, linking to your press release, writing our own hard-hitting expose. That’s why journalists are so arsey on the phone. Well, some of them.

DO read my post about how PR and journalism are orthogonal. You don’t ring up McDonalds asking them to fix your car. A lot of PR is getting too mailing-list driven. Know your journalist before you email them.

But most of all do include links. Put this stuff on the web. It’s 2010, not 1995. News organisations have changed. Why hasn’t PR?

DRM? MP3? From the 2001 catalogues: Windows XP won’t be able to create high-quality MP3s

This story was first written for The Independent to appear in its 13 April 2001 edition. $2.50 for every copy of iTunes? One wonders if Apple will ever remove the facility to encode in MP3 from iTunes….

BY CHARLES ARTHUR

Technology Editor

Are you still listening to MP3s? Microsoft wishes you wouldn’t; and so does the record industry – the first because it would rather push its own, proprietary music-digitising format, and the latter because MP3s have, it claims, undermined the business through web sites such as Napster.

Although millions of Internet users have shown themselves to be hooked on the MP3 format, which can turn music tracks into small files that can be swapped and transmitted over the Net, Microsoft said that its next consumer operating system, Windows XP, due out in autumn, will “not include” the ability to produce high-quality MP3s.

That will severely restrict the listening quality of any music turned into an MP3 with that program. Instead, anyone trying to digitise music will be encouraged – not particularly subtly – to use Microsoft’s own “Windows Media Audio” (WMA) format.

Meanwhile RealNetworks of Seattle, which was set up by a former Microsoft employee, is also pushing its proprietary RealPlayer format for digitising music.

The intent: to ease computer users to a position where they cannot send each other copies of music without paying for them. Both the Microsoft WMA and RealPlayer formats have “digital rights management” software, with copyright protection built in that will automatically police the use and sharing of music between computers. Only people who can show they have permission to listen to a WMA or RealPlayer file could listen to it on their computer – unlike MP3s, which can be swapped freely.

The WMA format does have the advantage that songs take up less room on disks. But with new technologies providing exponential increases in storage in all formats, that is unlikely to be a burning issue for consumers.

The intent of the two companies to have their own formats used by consumers belies the obvious popularity of MP3s, which are produced under an open standard: anyone can write a software program that will decode them, although software to create MP3s calls for a licence fee payable to the Fraunhofer Institute, which developed the format. That costs $2.50 for every copy of the software produced. For Microsoft, which hopes to sell millions of copies of XP, that could add up.

“We think at the end of the day, consumers don’t really care what format they [record] in,” said Dave Fester, a manager at Microsoft’s Digital Media Division. He said that despite the new restrictions, XP will do “a great job of making sure our player will play back MP3s.” But for new content that users might want to create, he says there “are clear advantages” to not using MP3.

Clear for Microsoft, and also for the record industry, which has been driven to distraction by the success of MP3s, particularly in the form of the Napster file-swapping service, which has allowed tens of millions of people to download literally billions of tracks without paying for them.

That is where consumers and the record industry diverge. “The industry doesn’t want [MP3] pushed, and Microsoft and RealNetworks don’t want it pushed. The consumer is going to eat what he’s given,” said David Farber, former chief technologist at the US’s Federal Communications Commission, who has generally opposed Microsoft.

He thinks that XP will be a major weapon in that. “When Microsoft decides to put something in their operating-system support, it becomes the standard,” says Mr Farber, who testified against the company during the Microsoft antitrust trial. “The average consumer will use what comes on the disc when he buys the machine. They’re very effective in that way.”

But even those who wish MP3s would disappear allow that that might never happen. “It’s a little like the VHS tape,” said Steve Banfield, general manager at RealNetworks. “DVD is great, but VHS is ubiquitous and it isn’t going away anytime soon.”

–story ends–
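The licensing sums the story gestures at are easy to run. A sketch: the $2.50-per-copy rate is from the article, but the sales volume below is purely hypothetical:

```python
PER_COPY_FEE = 2.50  # Fraunhofer licence fee per copy of encoding software, per the article

def mp3_licence_cost(copies_sold):
    """Total licence fee owed for a given number of copies shipped."""
    return copies_sold * PER_COPY_FEE

# If an OS vendor shipped 100 million copies with an MP3 encoder built in:
print(mp3_licence_cost(100_000_000))  # 250000000.0, i.e. $250m
```

At those volumes you can see why Microsoft preferred to push a format it owned outright.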

Terrible trailer, great film: Le Concert goes against the flow – so when else has a trailer undersold a film?

Observe the trailer above: it tells you pretty much from the outset that this is a comedy – it’s going to be one of those dash-everywhere-oh-my-god-can-they-do-it, rather like the last 10 minutes of Notting Hill (hope that doesn’t ruin it for you).

If you look at an early version of the poster (here at Coming Soon) then you get the message straight away: Comic Sans font! Hey, it’s a laugh!

If you went by the trailer, or the Comic Sans font and the rib-nudging tagline, you’d think that The Concert is just a bit of comic nothing – an easy way to pass 90 minutes or so.

No. Completely not. It’s a terrific film which packs a huge emotional punch in a closing section which has no dialogue at all but explains all the loose ends in the story. (There’s a question about whether some of what you see in that section is a flash-forward or just an ambition – I think it’s a flash-forward which, for reasons of keeping the ending tidy, had to be put before the climax).

Don’t just believe me – IMDB, the movies database, is a reliable guide to what people think of a film. And people there give it 7.5/10 (I’d give it higher, personally).

It’s one of the rare examples I’ve seen where the trailer gives you no idea of the emotional power of a film; it makes it look like a silly comedy, but it makes many more points – some of them in comic fashion, sure, but the heart is serious.

It’s unusual, isn’t it, for a trailer to undersell a film? Before seeing The Concert, I saw the trailer for Knight and Day, the Tom Cruise/Cameron Diaz effort – trailer here (no embedding allowed, it seems). But people on IMDB give it 6.6. Every time I’ve seen the trailer, I’ve thought “I’d like to see that film. Looks fun. Cruise not taking himself too seriously.” Apparently not, going by the people who’ve seen it.

So do you know other trailers that have undersold the film? Do tell. Obviously, reference to IMDB to prove that it’s a great film may be needed…

Why Ryanair will not implement – or will withdraw – its toilet charge: because it will cut profits

It is annoying to see the annoying company Ryanair – whose motto I imagine to be “if they’re stupid enough to fly with us, they’re on a mental level with sheep and should be treated as such” – given occasional credibility over ludicrous ideas without anyone asking the straightforward question.

Such as: would implementing that idea actually cost Ryanarse money, or profits?

When Michael O’Leary makes a stupid pronouncement, the media seems happy to repeat it. None seems happy to examine it and throw it back at O’Leary to ask whether he has lost his mind and is trying to annoy his shareholders as well.

For instance: charging people to use the toilet. (That’s a Google search link: the top link at the moment is to an April 2010 story saying that Ryanair is going ahead with it… and the third link is from February 2009, with “pilots aghast at proposal to bring in £1 charge”, which shows you how long this story has been bing-bonging around the mediasphere.)

Let’s examine this the way it should be examined: from a business standpoint. If Ryanarse starts charging for access to the toilet, I think it will lose money. Here’s why.

1) emptying the toilet reservoirs (known, charmingly, as the “honey tanks”) is a fixed cost. It’s done at the end of every flight. And the toilets on aircraft are never in a wonderful state.

If Ryanarse starts charging for the toilet, fewer people will use it. Obviously. It may also have to do more cleanups from parents of young children who run out of money. It’ll also have to get staff to watch over the toilet to make sure people don’t hold doors open for each other – which will be unpopular with the aircrew, since nobody likes to be toilet cop.

So it will get a bit of money from people paying to use the toilet, though there will be fewer visits – meaning that the fixed cost, cleaning the toilet reservoir, will only be slightly offset by the takings. And aircrew will have two new grievances: cleanup and toilet cop rota.

But while Ryanarse makes some money from selling toilet access, it will lose money from sales of coffee, tea and other liquids. This is stupid, because it already has the highest prices for coffee and tea and food according to a 2008 survey by Which? Holiday:

The Irish airline charges £2.50 for a bottle of water and £2.50 for a cup of coffee while a small bottle of red wine costs £5.00.

Why will it lose there? Because people will think “Hmm, if I drink this coffee I’ll have to pay for letting it out too.” So the passengers won’t buy the coffee or use the toilet. Ryanarse is suddenly losing money: the profit it used to make on coffee/tea sales. And that is pure profit: apart from heating the water, pretty much everything that it buys for coffee/tea – instant coffee, teabags – can be reused on another flight if it isn’t used. Whereas the toilet reservoirs have to be emptied every time; it is actually more efficient to encourage their use – that way, you get your money’s worth for the cleaning services.
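The argument above can be put into a toy model. Every number here is invented for illustration; the point is only the structure: the toilet fee chips away at a fixed cleaning cost, while the lost drink sales were near-pure profit.

```python
def profit_change(toilet_fee, visits_after, drinks_lost, drink_profit):
    """Net change in per-flight profit if a toilet charge is introduced.

    toilet_fee:   charge per toilet visit (pounds)
    visits_after: paid visits per flight once the charge deters people
    drinks_lost:  drink sales lost because passengers avoid liquids
    drink_profit: near-pure profit per drink sold (pounds)
    """
    gained = toilet_fee * visits_after
    lost = drinks_lost * drink_profit
    return gained - lost

# Invented numbers: a £1 fee, 20 paid visits, 30 fewer drinks at £2 profit each.
print(profit_change(1.0, 20, 30, 2.0))  # -40.0: the charge loses money per flight
```

Plug in whatever estimates you like; unless the fee revenue outruns the forgone drink profit, the charge is a net loser – and the honey tanks still have to be emptied either way.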

Michael O’Leary – who I think is despicable; if you want to think of the future driven by his credo, imagine Adam Smith’s invisible hand slapping the human face forever – ought to be able to see that charging for access to the toilet is a stupid move, economically. It would actually make better business sense to announce that the “toilet charge” will be rescinded – and raise the price on coffee and tea. In fact, expect it.

And if O’Leary is too stupid to see it, then perhaps his shareholders could show him this blogpost.

And finally, to the business press: next time O’Leary puts forward a stupid idea like this, ask whether it can make business sense. Think about fixed costs and operating costs. And quiz him. When he can see he’s going to lose, he caves in. I think if this is implemented, it will be a money-loser. But you’d need to ask the hard questions – how many drinks are sold per flight before, how many after, what’s the take – to know whether, when Ryanarse announces it’s not implementing (or is withdrawing) these charges, precisely why it’s doing it.

My suggestion: it won’t be because of an outbreak of warmth in O’Leary’s heart, which I imagine as a coal-black thing that would make Lord Voldemort shudder.

Handheld computers: how it looked in September 2000

This piece first appeared in The Independent around September 2000. Given all the talk about some handheld(ish) computer released by some company or other, I thought it might be interesting to look back on…

A couple of notable phrases: “Microsoft’s failure in this market is unusual..” and at the end that “In the long term though functionality is sure to win out over form”. Debate among yourselves whether this was just history talking…

Handheld computers

BY CHARLES ARTHUR

Technology Editor

Handheld computers cannot do what most people want them to. This may seem surprising, given that millions of models using operating systems from Palm, Psion and Microsoft have been sold since 1984, when the British company Psion introduced its first handheld model.

But all are severely limited compared to the expectations placed upon them, which can be traced back to two sources: the 1960s TV series Star Trek, and the hit BBC radio series first broadcast in the 1970s, The Hitchhiker’s Guide To The Galaxy written by Douglas Adams. Only at the turn of the [21st] century does it look like people will soon be able to buy products with the facilities that people have been hankering after for decades.

The sight in the 1960s of William Shatner as Captain Kirk landing on alien planets and flipping out a palm-sized machine which could act as a radio, intelligent locator and general categoriser of knowledge had a subtle effect on the baby boomers’ belief about what computers of the future could and should do. It was voice-activated, and context- and location-sensitive. Similarly, in the radio series, the Hitchhiker’s Guide to the Galaxy was actually the name of a computerised guidebook. It contained as much information as the galactic hitchhiker could need. While its indexing method was hopeless by any standards – the traveller had to look up a number in the index and enter that in order to get the corresponding entry (“so bad it could have been designed by Microsoft,” Adams later quipped) – it did create the belief that someday one could build a handheld machine able to hold all the knowledge not just in the world, but in the galaxy. And if aliens had them, why shouldn’t we?

The reality of the first retail products was rather different. The Psion 1, the brainchild of David Potter, was launched in 1984. It had a mighty 10K of non-volatile memory, an alphabetic keypad and a one-line 16-character LCD screen. Entering data was tedious. It would not have passed muster with Captain Kirk. However its descendants are now widely used by people in jobs requiring simple data collection, notably including traffic wardens.

In August 1993 Apple Computer launched its $700 Newton, which seemed at the time to promise at least some Star Trek functionality. It had handwriting recognition software able to “learn” your specific cursive style; there were promises of wireless communications and word processing.

It turned out to be an example of the computer industry’s occasional hubris. The software did learn your writing style, but often failed to interpret the letters correctly. The Newton was a flop (officially abandoned in 1998, but dead some years earlier) which poisoned the well for entrepreneurs in the US handheld market for some years. Bill Gates of Microsoft reckoned it put the market for such products back by two years. (Probably an underestimate.) Palm Computing, founded by Jeff Hawkins and Donna Dubinsky in 1992, only managed to survive by selling itself to the modem maker US Robotics early in 1996.

However, in the UK and Europe Psion was thriving, and had developed its Psion Organiser 3 series, which had a miniature keyboard and inbuilt software including limited word processing, a calendar, contacts book and spreadsheet. It looked like a miniaturised version of a laptop computer, and proved very successful in its local market.

But on the west coast of the US, Hawkins and Dubinsky were developing a palm-sized machine which would have some, at least, of the ease of use both of Captain Kirk’s communicator and The Hitchhiker’s Guide. Hawkins envisaged a machine – which later became the Palm series – that would not stand alone, but would synchronise and back up its files with a standard PC. Thus it would not have to do everything; only have enough functionality to be useful while out of touch with the PC.

For data entry, he developed a shorthand cursive system called “Graffiti” which all Palm users have to learn. He tested the ergonomics of the product by carving a block of wood into a size and shape that he could carry comfortably around in his pocket. Function and form thus developed in parallel.

The Palm operating system was hugely popular, even though the basic machine only offered a calendar, address book, task (“To-do”) and notes list, plus a calculator and search system. Its success stemmed from its ability to coordinate with a PC; the openness of the operating system; and the coincidental rise of the Internet. The first point meant users could access their databases more easily than with tiny keyboards; the second that software developers could write programs to enhance the machine; and the third, that those programs could be widely and quickly distributed. Psion, with its EPOC operating system, had attracted some software developers but was held back by its European location (where Internet development lagged by a couple of years compared to the US) and lack of connectivity to PCs.

Launched early in 1996, the first Palm computer sold 1 million units in 18 months. In 1998 Hawkins and Dubinsky left with Ed Colligan, marketing head of Palm: they were dismayed by the slow working of the monolithic 3Com, which had bought US Robotics. They set up their own company Handspring, and licensed the Palm OS, which then had 80 per cent of the world market, served by 100,000 developers, while Psion and Microsoft scrabbled over the remainder.

Microsoft’s failure in this market is unusual, but seems to stem from its WindowsCE operating system (renamed and rebuilt as PocketPC in spring 2000) being too complex for the limited power of the machines. WindowsCE is used in petrol pumps and set-top boxes for decoding digital TV signals.

The future promises rapid change. Until 2000, handheld computers sat apart from mobile phones: an address list on one could not be transferred to another. As usability expert Jakob Nielsen noted, this is absurdly inconvenient. Mobile phones are no good for noting data (such as phone numbers) while you are in a call; but handhelds have been little use for making phone calls.

But Handspring especially has been forcing the pace, as its Visor machines, which use the Palm OS, include a slot called the “Springboard” where the user can plug in items such as a camera, memory module and – from autumn 2000 – a GSM modem.

That abruptly made the Handspring into the potential killer combination of handheld address list and mobile phone. Palm rapidly announced that by the end of 2000, all of its products would have wireless capability. Separately, IBM demonstrated a version of a Palm machine with an add-on board which gave it voice recognition capability, using the ViaVoice technology. Suddenly, the humble handheld was beginning to look like the machine which would be able to do everything.

But mobile phone makers and Psion are not finished. The so-called “third generation” of mobile phones, which will have high-speed data connections, were being designed in 2000, and the Symbian consortium (which uses the Epoc OS) won a contract to provide the OS for a number of phone companies.

What was still unclear at the end of 2000 was whether handheld computers would swallow mobile phones, or vice-versa. The handhelds had the functionality; the mobile phones had the usability. However the mobiles rapidly lost that edge as new WAP (Wireless Applications Protocol) phones attempted to squeeze Internet interactivity into a few lines of a monochrome LCD screen. In some respects, it was a step back to 1984. But the market’s explosive growth may mean that there is room for everyone to survive. In the long term though functionality is sure to win out over form.

end//

How PR fail works. Or fails to work.

Hot on the heels of Kevin Braddock, who posted (and then rescinded) a long list of PRs who had sent him annoying emails, I’ve been noticing a rise in the number of rubbish emails – badly targeted, irrelevant, trivial, stupid – that have been landing in my inbox.

The cause, as we all know, is companies that gather lists of journalists, assign vague labels (“technology”) and then pimp those lists to all sorts of PR companies. Meaning that the puzzled (to begin with) journalists get bombarded with emails about all sorts of “technology” topics, from heavy plant machinery to web apps for diets to which company has won a contract to do the voice and computer networking for Company X. (The latest to annoy me again in this way is Cision, which keeps pimping my email in this way. I really dislike them. I’ve searched my very large email repository for emails sent via Cision, and NOT A SINGLE ONE has been useful or relevant. That’s quite a non-achievement.)

This is always done with no regard or interest or even checking as to whether the journalist is interested, or has ever written about this topic. That’s because, of course, it costs the PR nothing to send the email; the annoyed journalists’ wasted time simply doesn’t show up on the balance sheet. (One can make similar points about environmental degradation and the economy, but that might be conflating the trivial and the important.) An economist would tell you that the journalist and the environment both fall into that plain category of “externalities”, aka “I don’t mind, and you don’t matter”.

So here’s how I explained it to a PR person in an email. I’d asked them to stop sending me their irrelevant rubbish. The PR person wrote back with what he thought was a stout defence.

PR person: I sent this release to you on the basis that your readers might be interested in how a company like XXX organises its [computer] network, despite this type of story not being your main focus.

In other words, what the PR person was saying was this: “Despite the fact that you’ve never written about the topic, haven’t written anything else that looks like that subject, and haven’t written anything about any of the other scores of emails that we’ve sent you. It was just nice and easy and since you didn’t come round to our offices and actually kick us, you must have been really enjoying receiving them.”

Wrong, wrong, wrong and wrong. Should I spam every PR company with requests for interviews with everyone I want to talk to – film stars, rock musicians, top technologists – even if those PR companies don’t handle those sorts of clients or subjects? Should I send out an email every week to every PR person and company in my contacts book saying “Look, I’d really like an interview with Steve Jobs, Jonny Ive, Sergey Brin and Larry Page – can you sort that out?”

No, because it would be a complete waste of time for virtually everyone. But it would be trivially easy – I could set up a computer script that would do it without my interaction. Or I could just put a few different names in each week.

Imagine what it would be like to be in PR: as a recipient, you’d ignore it at first, but if every journalist did it, you might find it wearying. And then you might begin by asking the journalist to stop.

It should be so simple: know the journalist (by reading what they write about), then send only the emails they might be interested in receiving. But the externality problem between PRs and journalists is huge. I’ve written about it before. I just wish that some of the people who send out these pointless emails would stop, but of course it’s the worst ones who ignore it, and it’s the worst practitioners who pimp ever-expanding lists of email addresses. Sturgeon’s Law is alive and well.

Just run that past me again, Professor Negroponte, about the talking doorknobs

From The Independent, November 13 1999:

See, I love it when you can come back to things after ten years. Count the things that have come true.

BY CHARLES ARTHUR
Technology Editor

Doorknobs that talk, computers that you swallow and phones that don’t ring if there’s nobody to answer them will all be reality within 10 years, according to Professor Nicholas Negroponte, director of the world-famous MIT Media Laboratory, and one of the best-known of Internet gurus.
Addressing the theme of how computing will pervade our lives, Professor Negroponte said: “You may wonder about how computing could possibly affect something like a doorknob. But if you think about it, an intelligent doorknob would be a really useful thing.
“You would not need keys: it could identify you by your fingerprints, and perhaps confirm your identity by asking a question – ‘What’s your mother’s maiden name?’ for example. Why would you need keys anymore?”
The smart doorknob could also accept parcel deliveries – and perhaps sign digitally for them; “and maybe it could let the dog out, and then let it back in while keeping out the other nine dogs following it.”
The technology required to do that is already sufficiently miniaturised, he said: such “embedded” systems could surround us. “We will have thousands, perhaps tens of thousands, of embedded chips around us, all intercommunicating,” he predicted to an audience in London.
Professor Negroponte, author of the book “Being Digital”, espouses the view that anything which can be expressed as computer “bits” – such as words, images, video, designs, music – will eventually be transmitted in that form across the world, speeding up transactions and cutting costs. Human activity consists either of manipulating “atoms” – irreducibly physical objects – or “bits”, which contain ideas or symbols. His forecasts have been largely confirmed, especially by the move of music to new digital formats such as MP3 and the rise of electronic commerce.
As computers shrink and become pervasive over the next decade, the sort of information they can access will grow, he forecast. “If you want a really futuristic product for 10 years hence – you’ll have computers that you eat, one per day. It will contain devices and sensors which will record all your anatomical measurements, what’s going on inside you, and relay them to a black box that you wear on your belt. If it passes through you, no problem – swallow another.”
The value of such systems is evident if you consider the problems presently faced by doctors, he said: “Today, you go and say something is wrong, and you tell the doctor a story about how you felt perhaps 12 hours ago, which you can only imprecisely recall. From that, a doctor is meant to make a careful diagnosis and recommend a solution. This may be unfortunate timing after the Egypt Air crash, but I have wondered for a long time: why don’t we have black boxes? Then we could take them to the doctor, and they could read them to see what was wrong with us.”
Professor Negroponte also foresees telephone handsets becoming smarter. “Why do phones ring?” he wondered. “If there’s nobody there, no one will answer. Phones should be built smart enough to know if there’s nobody there. And if there is someone there, they should be able to answer them, like a good butler, and find out who is calling and why, and only then decide whether to get our attention.”
But there are still some giant steps to be made for the average user of computers, he admitted. “Who would have believed, ten years ago, that big segments of the population would spend between £1,000 and £2,000 on their own computers – and that those machines would reduce people to tears once or twice a week?”

I found the hacker… and I’m wondering where else he might be

So finally I had to take Stefan’s advice. Having upgraded to WordPress 2.8.5 over at the Free Our Data blog (where I’ve been having problems with a hacker who’s been inserting spam links invisibly into the end of the page), I …

Oh. And while I was writing that, I noticed – from the FTP transfer that was going on to do a second comparison – that there are a ton of spam pages in the site. Sodding hackers.

Anyway. I downloaded the entire blog content, and then ran a diff – that is, used FileMerge (which comes with the Apple Developer Tools, free on your OS X install disc). It compares the content of any set of files, or directories.

Of course the site I’d downloaded was pretty old, and had been upgraded loads of times, so there were loads of files that were on the left (old) and not on the right (new). They just hadn’t been deleted.
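For anyone trying the same thing from the command line, plain old diff does the same job as FileMerge. A sketch, using a made-up pair of directories standing in for the downloaded site and a clean copy of the same WordPress version:

```shell
# old/ stands in for the downloaded site, fresh/ for a clean copy
# of the same WordPress version (both are toy examples here)
mkdir -p old/wp-includes fresh/wp-includes
echo '<?php // genuine core file' > old/wp-includes/functions.php
echo '<?php // genuine core file' > fresh/wp-includes/functions.php
echo '<?php // Codz by angel(4ngel)' > old/wp-includes/ui.php  # the planted file

# -r recurses into subdirectories, -q reports only which files differ
diff -rq old fresh
# the planted file shows up as: Only in old/wp-includes: ui.php
```

Any file that turns up as “Only in” the downloaded side – and isn’t on WordPress’s own deleted-files list – is worth opening by hand.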

Slogging on… I came across a WordPress page which explains which files have been deleted in the move up from 2.7 to 2.8. It’s a useful list and I was working my way through it. Slowly.

By now I’d got to blog/wp-includes/js/tinymce/themes/advanced/skins/o2k7/ and was starting to marvel at how deep WordPress goes – when I came across a rather odd file, ui.php, which had the interesting opening:

Codz by angel(4ngel)

Make in China

Web: http://www.4ngel.net

Hmm, is it very likely that a valid WordPress file would really have that sort of comment? And more telling: when you loaded it in a PHP editor with live PHP rendering, you got this:

Yup, it's a hacker's login

Which in essence says: oh, lordy, you’ve been hacked.

Much digging around followed. It’s a fascinating file: it allows the hacker to download your database, and possibly upload chunks as well. I’m going to have to do an SQL dump now to see whether the content of older posts has been hacked (a favourite trick, apparently).

I also discovered a slew of website pages hidden in a directory called “Online” in the “Default” theme folder – which of course every WordPress install will generally have, so that’s a smart place to put it. (That also makes it a good one to delete.)

But as far as I can tell, the site is clean now. My best guess for how they did this is that it was one of the WordPress weaknesses via user registration – this one? This one? There are so many to choose from – and that it’s been sitting there for an age, just waiting to be exploited, or perhaps being exploited and I didn’t spot it. (Certainly neither Google’s indexing nor I discovered the hack of the /default/images folder – which is intriguing. Have you checked that folder lately?)

I hope this is the end of the tale. I’m not pinning everything on it though.

One other point: thanks again to Stefan Pause, who has helped a lot on this (what’s your site, Stefan?) I’m now alerted to the WordPress Exploit Scanner plugin, which will look through your site and find any suspicious CSS, HTML or similar. It reckons that there’s nothing suspicious in the older posts. Good-o, though I’d like to (and will) make sure myself.

Endnote: interestingly, Google won’t allow the ui.php file to be emailed, even in zip form. (I wanted to send it to my web host to explain what I’d found and tell them to search for it.) So obviously Google Mail’s already got some sort of hashing going on to detect malware being passed around. Impressive.

The hackers? Boring, really, that this sort of endless diversion from site to site is how they make their money. All that enthusiasm and knowledge and ability, turned to trying to persuade people lacking self-esteem to buy pills of unknown quality from sites of extremely dubious status. Isn’t there something better we could do with all our time here?

Super-endnote: and then I found another file – this one in /wp-includes/Text/Renderer/Diff/, called online.php (a bit of a clue by now, because it’s all about the “online” crap these guys are selling).

The WP Exploit Scanner tipped me off – it noted that the file contains a base64-encoded command, which usually means “something to hide”.

And so it proves: here’s the picture you get
Hacker control interface dropped inside WordPress

(You can see the full-size thing at Flickr.) And hey – what is it about hackers and the black backgrounds? Too much watching the Matrix, I think. Forget it, guys – you’re not The One, you’re pushing junk pills.
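Since both planted files leaned on the same trick, a recursive grep is a quick way to sweep a whole install for it. A sketch against a toy directory tree (file names and the encoded payload here are made up – the payload is just a harmless phpinfo(); stand-in):

```shell
# Toy tree standing in for a WordPress install
mkdir -p site/wp-includes/Text/Renderer/Diff
# planted file: the base64 string decodes to phpinfo(); - a harmless stand-in
printf '<?php eval(base64_decode("cGhwaW5mbygpOw=="));' \
  > site/wp-includes/Text/Renderer/Diff/online.php
# an innocent file for contrast
printf '<?php echo "harmless";' > site/wp-includes/version.php

# -r recurse, -l print only the names of matching files
grep -rl 'eval(base64_decode' site
```

Anything grep names is almost certainly not legitimate core code, and wants reading before deleting.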

The hacker leaves more footprints… but how many sites have this problem?

Another day, another little hack on freeourdata.org.uk’s front page – once more adding spam links to it, made invisible with the stylesheet command div style="display:none".
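Spotting that kind of injection doesn’t need anything fancier than grep. A sketch, run against a made-up sample page standing in for the hacked front page:

```shell
# index.html is a toy stand-in for the hacked front page
cat > index.html <<'EOF'
<html><body>
<p>The real content of the page</p>
<div style="display:none"><a href="http://example.com/pills">buy pills</a></div>
</body></html>
EOF

# -n prints the line number of each hit, so you know where to look
grep -n 'display:none' index.html
```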

What’s interesting this time though is that the person doing it has decided to be a bit more subtle. Rather than doing it all by hand, he’s clearly decided that automation is the thing.

And so the inserted spam-generating code is just one line of PHP. One line!

Ah, but it’s clever – it’s a base64_decode (ie, a string of encoded stuff) which is then enclosed in an eval() statement.

So PHP decodes the base64 stuff and then does what it’s told by that statement.
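To see what one of these one-liners would do without actually running it, you can decode the string by hand rather than letting eval() at it. The payload below is a made-up stand-in, not the real one:

```shell
# A hypothetical injected line might read:
#   <?php eval(base64_decode("ZWNobyAnc3BhbSBsaW5rcyc7")); ?>
# Decode the string on its own; nothing gets executed
echo 'ZWNobyAnc3BhbSBsaW5rcyc7' | base64 -d
# prints the hidden PHP: echo 'spam links';
```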

And what it’s told to evaluate is to get the content of a URL: http://weberneedle.com/pictures/header/h/freeourdata.org.uk.html.

Weberneedle, in case you’re wondering, is part of Weber Medical. Obviously, it’s been hacked.

The spam is pointing to two directories – http://sportsnation.espn.go.com/fans/Thomas9385 and http://www.anats.org.au/statechapters/act/images/online/canadian/. They’ve been hacked too. (Oh, Anats – the Australian National Association of Teachers of Singing. You’re offering links to a lot more than singing, I’m afraid.)

But it would be interesting to know how many more sites weberneedle’s hacked directory is pointing to.

And the bigger question is: how many sites out there have been hacked? In the course of my experiences alone I’ve come across half a dozen. (And I’m still trying to locate and close the hole in our server that makes this possible, of course. It’s annoying, but not disastrously so.) How many millions (and yes, I mean millions) of sites are there out there which have been exploited in this way, and which are therefore pointing to stuff they never realised?

At some stage there’s going to have to be a massive clearup – but I can’t imagine it happening. You’d pretty much have to turn the web off and on again.