Are multi-touch surfaces heading your way?

In the days of black screens and green type, the arrival of colour was somewhat puzzling. If computers had got us so far without colour, who'd want it? Everyone, it seems.

Then came windows, icons, mice and pointers. Again, we were all happy with what we had. Why rewrite everything for some gimmicky whizzbang interface? As soon as you used an Apple Mac, you knew the answer. Ordinary people were suddenly able to do extraordinary things. But it wasn't until 11 years later, when Microsoft finally got its act together with Windows 95, that this interface started to become more or less ubiquitous.

And there we've stalled for 26 or 15 years, depending on whether you're a Mac or a PC fan. It works. Who wants more? Well, since the time the Macintosh came out, inventors have toiled in labs to bring us a more natural, direct interface based on fingers, hands and, in the case of horizontal displays, objects placed on the screen. In recent years, pioneering companies like Perceptive Pixel, Apple and Microsoft have been selling multi-touch surface devices.

In the abstract, it all sounds jolly fine (apart from the potential for the unselfish sharing of germs). You can access, open, expand, move, rotate and contract information artefacts right there on the screen. They could be images or documents inside the computer. Some of the systems can even interact with other things lying on the screen's surface. The external artefacts might be coded underneath so the system knows what to do with them, or they could be simple things like business cards or other documents, which can be scanned. In one case, a library in Delft would whizz up pictorial information about your post code as it read your library card. The Microsoft Surface can recognise and communicate with a suitably enabled mobile phone, showing the phone's contents in an on-screen notebook; slide items to and from the notebook to update what's on the phone.

You could throw up a keyboard or, indeed, a facsimile of any kind of device, but the main potential at the moment seems to be exploration, manipulation and mark-up. Fingers are better at some things but certainly not everything. However, if your organisation needs to surface information to any audience, regardless of their computer skills or application knowledge, then this might be a better way to do it than the usual single-touch, keyboard or mouse controls.

The Hard Rock Café in Las Vegas has a number of Microsoft Surface tables through which visitors can browse a growing part of the company's collection of rock memorabilia. The National Library of Ireland uses the same product to show rare books and manuscripts which would otherwise be kept from public view due to their fragility or value. The US military uses Perceptive Pixel's huge displays for God-knows-what but you can bet that some of it involves 3-D terrain, flying things and weapons. Then Apple, of course, has made the iPhone exceedingly sexy with its own gestural controls.

While the technology and the functions are intriguing and seductive, the question is whether they give sufficient advantage over what's being used today. They cannot replace the present range of control devices except in special, application-specific situations. Just as mice and pointers didn't replace keyboards, nor will multi-touch replace current devices. They may complement them though, especially as they become part of the repertoire of the everyday laptop or PC.

Whenever new technologies come along, it's quite often the user department that takes them on board, side-stepping IT if possible. We saw it with PCs and spreadsheets. We saw it again with desktop publishing. And again with mobile phones and PDAs. But, eventually, either the users or the organisation realise that the greater benefit comes from integration. IT represents the great archive in the sky to which, and from which, intellectual artefacts can be stored and retrieved. And, once IT is involved, more things become possible: using the mobile phone as a terminal, access to and re-use of materials produced elsewhere in the company and, in the case of multi-touch, delivering the contents of information stores to the devices. Museums and libraries are, perhaps, obvious examples, but some users would value a natural way to get at and drill into, say, statistical information by geography, or to find and explore whatever today's equivalent of a blueprint is.

Right now, you might see these multi-touch surface devices as a bit of a curiosity but, just as the mouse (first publicly demonstrated in 1968) moved into the mainstream eventually, so these things may become important to you and your organisation.

If you're interested, a great place to mug up on the background is Bill Buxton's Multi-Touch overview.

It’s taking longer than I thought

This is just a post to let you know I’m still here. Apart from contractual agreements, I have largely avoided writing stuff while I get my act together. This has meant hundreds of emails and address cleaning/updating. In fact, 555 emails started around 100 conversations, seventeen of which could turn into business relationships, three of them reasonably imminently.

Martin Banks and I have done some training (how to handle the press) in exotic locations – well, Ireland and France, to be precise. It’s still an incredibly satisfying experience and well received by clients. I helped a chum out with editing a mini-book. My talents seem to lie in distilling clarity from complexity. It’s most enjoyable when the subject matter is somewhat ground-breaking.

One fascinating exercise was to analyse the text of the 180 or so articles, columns and blogs that Freeform Dynamics listed in its sidebar while I was there. I fed this into TagCrowd and it created a word cloud of the 50 main themes. The result is that you can look inside my head:

FDArchive Word Cloud
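
If you’re wondering what a tool like TagCrowd is doing under the bonnet, the idea is simple enough to sketch: count word frequencies, drop the common stopwords and keep the top 50. Here’s a minimal illustration in Python – the file name and stopword list are my own assumptions, not TagCrowd’s actual method:

    # A rough sketch of word-cloud-style frequency counting, not TagCrowd's
    # real implementation. "fd_archive.txt" is a hypothetical file of article text.
    import re
    from collections import Counter

    STOPWORDS = {"the", "and", "a", "of", "to", "in", "is", "that", "it",
                 "for", "on", "as", "are", "with", "be", "this", "or"}

    def top_themes(path, n=50):
        text = open(path, encoding="utf-8").read().lower()
        words = re.findall(r"[a-z']+", text)              # crude tokenisation
        counts = Counter(w for w in words
                         if w not in STOPWORDS and len(w) > 2)
        return counts.most_common(n)                      # (word, frequency) pairs

    for word, freq in top_themes("fd_archive.txt"):
        print(word, freq)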

In terms of social networking, I find Twitter a very serendipitous way of keeping up. I’m tipped off by email when I’m DMed, and an RSS ego search in Netvibes tells me if someone’s reaching out. Sometimes it’s impossible to get through all the tweets and, despite many attempts, I can’t get on with the ‘swim lane’ approach of TweetDeck. Blogging, you know, I’ve gone quiet on for now. I really only want to write new stuff or new angles – still trying to figure that out. The aim is to reduce noise and deliver value. (Apart from status posts, like this one.) LinkedIn is a great way to find out who cares. I only invite connections from people I actually know and, if they respond, then I feel we can sustain contact by other means. Like Twitter, its status updates are serendipitously interesting. For me, Facebook is mostly ‘friends and family’. I don’t use it for business contact at all, although a few business contacts crept in a while back. Finally, SlideShare is a great way to share not just presentation decks, but slide notes as well.

I have some photos on Flickr, most recently of Dandy the dog, but I’m not into photography that much – I find the lens gets in the way of life itself.

On that note, I think I’ll close this post. Feel free to keep in touch. I will, very soon, burst out of my self-imposed purdah. Have a great 2010 (those of you who’ve managed to read this far).

Collaboration and Control

Once upon a time, the boundaries of IT management were fairly straightforward. All your customers were inside the company and exchanging digital information with the outside world was highly controlled, if it happened at all. Not only that, but you sat down and figured out the business needs and then bought or developed the appropriate software which you then ran in-house. The users were obliged to take what they were given. Not quite easy peasy, but close.

Nowadays, users have their own views. They want to collaborate electronically with each other and with the outside worlds of business partners, suppliers and customers. They want to hold webinars, share screens, instant message each other, maybe even work on wikis together and comment on each other's blogs. You have to decide whether to allow these things to happen formally or informally. If formal, at least you have some control over what holes you allow in the firewall. If informal, you've probably given them web access and told them to behave themselves. Although the social media brigade will say, "Trust everyone," only you will know if that's going to work in your organisation.

If you do try to restrict what users can do, you'll be surprised at how inventively they'll sidestep your controls. Research suggests that if they can, they will. You are driven by the need to keep the enterprise system secure. They are driven, usually, by achieving results in the most effective way. These two drivers are not usually compatible.

Knowing that 'collaboration without travel' is at the heart of their needs, you start looking around at what's available. Broadly speaking, the bottom line is a choice between an externally hosted service and one you look after yourself. The externally hosted approach is a bit nerve-wracking because all your company's digital collaborations will be stored on someone else's servers. What if something goes wrong? The service provider could fold or you could simply fall out with it. Can you get all your records back? Will they be in a usable form? This is the stuff nightmares are made of. Some very major vendors are beginning to offer such hosted services. Perhaps you'd feel more comfortable entrusting your data to an IBM, a Citrix Online or a Microsoft, for example.

But the alternative, hosting it all yourself, brings its own problems. Scaling is one, but that's probably fairly easy to address. What about your own users, who are now merrily collaborating with each other, being able to collaborate with external partners of various kinds? Your lock-down could end up as a lock-out. And, in these days of close collaboration between organisations, this could be greatly to your detriment.

If partners, suppliers or customers are running different collaboration systems to you (as many will), be wary of the glib salesperson who assures you that interoperability is a piece of cake. Ask to talk to real users with similar needs to your own. Find out if your licence terms allow you to extend membership of your collaboration systems beyond the firewall. Ask a few of your business partners if they would be happy to work in this way. After all, they may be just as nervous about engaging beyond their own firewall.

It's so easy to find private systems that satisfy internal collaboration and security needs. The danger lies in forgetting that, over time, the constituency you serve is increasingly likely to involve ever larger numbers of outsiders.

Lotus knows, but do you?

With the prospect of a conference call with Lotus today, I thought I'd better try (yet again) to get my head around the extensive and, as a non-user, confusing product set. First stop was the IBM/Lotus product pages. Quite a bit of enlightenment, but you sometimes have to drill down unnecessarily to dig out the information. For example, once you get to 'Alloy by IBM and SAP', why doesn't it have a short explanation like the next item, 'Lotus Connector for SAP Solutions', has? I knocked up my own outline so I could see all the information in one place. Sad or what?

Anyway, having kind of refreshed my memory on the product side, I turned my attention to a recent IBM Lotus event called an 'IdeaJam'. This one bore the theme of the company's latest marketing campaign called 'Lotus knows'. The purpose was to get a lot of people from the Lotus community to come up with, and comment on, ideas for getting the brand better known. The discussion was broken into four categories:

Lotus knows working smarter depends on great technology…

Lotus knows marketing is key to technology adoption…

Lotus knows technology is only great with client success…

Lotus knows the world is getting smaller, flatter and smarter…

A terrific idea, except it's a bit like asking a church choir what songs they should be singing. They're going to choose the easy ones, the catchy ones, the ones that appeal to the choir itself. At least, after such an exercise, the vicar will know how to motivate the choir. But whether the choir's choices match the vicar's or the parishioners' needs is another matter.

Still, Lotus' event was very successful by its own standards. Over 20,000 votes were cast on 928 ideas and 2246 comments were made. Ideas could be voted for or against. The top vote (302 net votes) went to putting Notes in more schools worldwide. Second top (209 net) was raising awareness of Lotus among the rest of IBM sales staff. This is astonishing when you consider that IBM bought Lotus fourteen years ago. Talk about hiding its light under a bushel.

Maybe, just maybe, Lotus would have benefited much more, and been able to direct its marketing efforts much more successfully, if it had run IdeaJams with IBM mainstream staff and non-Lotus users out there in the real world. The danger is that they might say "Lotus who?" and refuse to participate.

The truth behind the Google/Microsoft/NHS rumours

Before Monday July 6th, did you know that Google and Microsoft had services for storing health records? Thanks to an article in the Times and some related hysteria in other media, just about the whole country discovered that "David Cameron was going to replace the bloated and expensive NHS computer system with a free one from Google. Or maybe Microsoft."

Except, of course, someone got hold of the wrong end of the stick. Let's face it, whatever we think of the NHS and its evolving computer system, it's not going to be replaced by a packaged service from anyone. Never mind that Google and Microsoft (and maybe BUPA) are supposedly the front runners.

No-one likes overspends on computer projects. And the NHS one due for delivery in 2014 – four years late and at a cost of £12.4bn – presents a wonderful target for the Tories. This seems to have been what caused all the excitement. From £12.4bn to 'free' at the stroke of a pen. Wow!

Who on earth thinks that commercial organisations like Google, Microsoft or BUPA will do anything for free? And who but the most naive will think that moving shedloads of detailed health records from one system to another is going to happen without horrendous cost and risk?

Still, it was a great headline and it, rather unexpectedly, put 'Google Health' in the frame. Whether involved or not, Rachel Whetstone, Google's Vice President, Public Policy and Communications, must be feeling jolly pleased with the outcome. (Incidentally, she's married to Steve Hilton, one of David Cameron's closest advisors. She dropped out of politics after a spell as Michael Howard's chief of staff during his failed election campaign. Oops, wrong horse.)

So what's the reality? The Google (Health) and Microsoft (HealthVault) systems both manage personal health records, or PHRs. They provide somewhere to create, store and share your personal health information and allow you to find related information, engage with health professionals and manage your medications. Both put the user in control of content and both are free to the user. This has little to do with the £12.4bn NHS system. At best it would take care of one element of it, the so-called 'Spine' Care Record Service (CRS), but with less information and more restricted access. Medical professionals need access to all manner of detailed information if they're to do their jobs properly and they're simply not going to get that from the personally-filtered subset of a person's medical information that the PHRs represent.

What's on offer smacks of "let's get to know your medical issues so we can fire appropriate ads at you". If not, one has to ask what the commercial motivations of Microsoft and Google are. Maybe it's to flog extra services: "Monitor your blood pressure, madam?" or "Remind you to take your pills, sir?"

With the baby boomers reaching retirement age, the market for health-related products and services is exploding. An increasing proportion are computer literate and have their own PCs and internet connections. And nothing is on their minds more than their health. (Okay, maybe their grandchildren and their pets.)

But let's not get carried away by recent newspaper reports. This is not David Cameron single-handedly demolishing the NHS IT budget. Sure, we'd love to enter what the Tories call a "post-bureaucratic age", but let's start by getting rid of all the deeply intrusive information that the government already stores about us.

Will sustainability turn BT Global Services’ fortunes?

The IT or, to give it its full name, the ICT industry has led a pretty charmed life. After being a participant for over forty-three years, it amazes me that it still manages to buck trends: from ever more power at ever lower prices to the potential ability to steer the planet and its occupants away from environmental disaster.

At least, that's the hope and the intention of the green IT industry. Manufacturers are gleefully churning out and selling more and more ICT equipment, while claiming that the environmental savings accruing from its use will mightily offset the environmental harm caused by its manufacture, operation and the disposal of whatever it's replacing.

Of course, IT isn't the only game in town. Cleantech industries are working hard on coming up with new things (with their embedded environmental harm) to reduce our overall environmental impact. It's paradoxical and uncomfortable, but it seems we have to do some more harm in order to do even more good.

One company that has an interesting environmental programme is BT Global Services. It also wants to be seen as "the IT provider of choice". It plans to do this by raising the level at which it consults with businesses, using sustainability as a lens. It has the IT in the form of data centres, software and services. And it has the C, because its core business is communications.

Global Services has posted some ghastly results recently and is in the middle of a restructuring. Perhaps it sees 'sustainability' as an opportunity to improve matters for itself and for the environment.

Anyway, if pretty charts are anything to go by, its Sustainability Practice has a comprehensive approach to helping its customers build sustainable organisations. Like many large companies (IBM, Cisco, CA and HP are just four examples), it has drawn heavily on its own experience to formulate its guidance for customers. For example, an early step in the process is a carbon assessment. This focuses on people, power and procurement.

People commute and travel on business and they use laptops, personal printers and mobile devices, for example. Power is used in office devices and data centre equipment, as well as heating, lighting and cooling. Procurement includes third-party services, hosted equipment, print services, transport and so on. These three elements are analysed according to the three 'Scopes' of the Greenhouse Gas Protocol. (Scope I is the direct burning of fossil fuels. Scope II is purchased electricity and the carbon emitted in generating it. Scope III is indirect activity such as staff commuting.)
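
To make the framework concrete, here's a minimal sketch in Python of how a people/power/procurement inventory might be rolled up into the three scopes. The activity names, scope assignments and tonnage figures are invented for illustration; this is not BT's methodology:

    # Illustrative only: activities and tCO2e figures are made up for the example.
    from collections import defaultdict

    activities = [
        ("company vehicle fuel",        "Scope I",   120.0),  # direct fossil fuel burn
        ("office electricity",          "Scope II",  340.0),  # purchased electricity
        ("data centre electricity",     "Scope II",  980.0),
        ("staff commuting",             "Scope III", 210.0),  # indirect activity
        ("hosted third-party services", "Scope III", 450.0),
    ]

    totals = defaultdict(float)
    for activity, scope, tonnes_co2e in activities:
        totals[scope] += tonnes_co2e

    for scope in ("Scope I", "Scope II", "Scope III"):
        print(scope, round(totals[scope]), "tCO2e")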

When you look at it this way, it seems obvious, but that's the deceptive thing about a simple framework.

Of course BT has a range of service offerings to match sustainability needs. And, as you might expect, substituting travel with communications looms large. And 'Homeshoring' is offered as a solution for UK contact centres. (With dog-cancelling microphones, perhaps?) The data centre hosting story is the usual one of greater carbon efficiency than a DIY approach.

The individual elements of the BT story aren't particularly original, but its telephony and networking pedigree hint at good service and security levels. It has many years of implementing sustainability initiatives with resulting business benefits. The savings it boasts sound huge, but they have to be considered in the context of BT's size (£21.4bn turnover last year): the £37m a year it saves in travel costs and the £238m it saved in one year through conferencing together amount to a little over one percent of that turnover. It also reports a 20 percent productivity improvement from flexible working arrangements.

BT has spent years trying to muscle in on IT's turf. Now that the industry really is ICT, perhaps this is the best chance it has. And, with the inevitable build-up to December's Climate Change Conference in Copenhagen, now seems to be a very good time for BTGS to set out its sustainability stall.

Virtual events aren’t real events shoved online

As you know, most of us are facing financial difficulties and some of us are becoming concerned about our environmental impacts. Or we may actually find ourselves being pushed in that direction by customer pressure or legislation.

We still like the idea of jetting round the world, or even driving round the country, in order to meet our suppliers, customers and work colleagues. But, faced with the aforementioned issues, we’re increasingly turning to online meetings and events. And, for many, the experience falls short of expectations.

Of the whole panoply of virtual engagements from webinars to telepresence, one type probably sticks out as the most likely to disappoint and that’s the virtual exhibition and conference. And this is probably because we all know what a physical event should be like, so we expect the same or something very similar with the online version.

This is a mistake.

They are not the same and each has its strengths and weaknesses. To ignore this, when planning to present, exhibit or visit, is to invite disappointment.

We are all familiar with the physical event, so perhaps it’s best to focus here on the good and the bad of the virtual. You may have your own views, in which case we’d love to hear them.

Primarily, a virtual event (subject to some technical and localisation caveats) is available to all, anywhere in the world. And it involves no travel or accommodation expenses. It will still, of course, take up some of the delegate’s time, but they can generally choose when they want to visit. (The events usually remain online for a while after the initial event closes.) If you visit in real-time, you can probably participate in live Q&As, for example, but you may put a higher value on personal convenience. Because of the social networking tools wrapped round a virtual event, you will still be able to reach out to speakers, exhibitors and fellow delegates as long as the event site remains live.

Exhibitors and speakers also benefit from lower costs, although these are mostly staffing, travel and accommodation savings during the event itself. They still need to prepare and adapt their approach to suit the online world. Making a recorded 90-minute PowerPoint presentation available online is really not taking advantage of the new medium or, indeed, the attention span of an online visitor. Remember that, just as with the web, escape for the visitor is just a mouse click away. In theory, a virtual event should be able to pull together a high calibre of speaker or panellist because of the smaller impact on their time. They would probably be happier to do shorter presentations too if they don’t have to travel thousands of miles for their appearances.

A hierarchical approach to exhibit materials would make sense. Exhibitors could offer a cascade of presentations from short and sweet down to whatever depth they feel is appropriate. And back this up with a menu of downloadable materials such as case studies, product/service information and white papers. This is similar to real life, except that shelf space is infinite, different languages can be accommodated and the materials can include podcasts and movies as well as documents and links to web pages. This self-service approach has the advantage for the delegates that they don’t have to run the gauntlet of the sales team in order to lay their mitts on the collateral. They’ll come back soon enough if they’re interested. And, because they’ve prequalified themselves, their value is much higher than that of the average booth visitor at a physical event.

Organisers and exhibitors, for their part, can collect an incredible amount of detailed business intelligence during the event. All the conversations a company has with its visitors, and who downloaded what collateral, could be captured. At a more anonymous level, all the visits, engagements and downloads made by delegates show the organiser which parts of the event are working well and which are not.

At real events, ‘networking’ is probably claimed as the number one payoff for the delegates. And it’s true that this physical, “look ’em in the eye and shake their hand”, contact is missing from online. This is an incredibly important facet of our everyday lives but, if you can’t afford the time or money to participate in an important event, then a virtual equivalent might be better than no event at all. Having said that, in some respects the virtual event is better because of the ability to check out companies and individuals through the event directories and make appointments to meet them virtually. It is also theoretically possible to stimulate serendipitous meetings by having lounge areas for people to virtually mingle, backed up by on-the-fly created chat rooms if they need privacy. This does, however, miss all those body language cues which tell us whether we want to make contact or not. But some things we’re just going to have to do without if we’re concerned about our budgets, our time and the environment.

Is telehealth coming at last?

Yesterday at Cisco's C-Scape analyst briefing, we were treated to a presentation by one James Ferguson. And what a treat that was. Cisco chose wisely. He was a good speaker, passionate about his subject (telemedicine, which he prefers to call telehealth) and a medical practitioner to boot. It was a real person talking about real things, not some propeller-head from technoland or, worse, a marketeer. This background, of course, made him a devastatingly effective salesman, and it wasn't until the Q&A that some of my (Scotch?) mist of enthusiasm started to clear.

His pitch was essentially simple. Because the coverage of the Aberdeen-based Scottish Centre for Telehealth (SCT) includes highlands, islands and oil rigs, it faces some rather unusual problems. Popping into the local hospital is hardly convenient. And doctors can't easily get to where they're needed. Not always in time, anyway. So SCT's been working on getting diagnoses done remotely in order to a) help people get the right treatment locally and b) identify those who need hands-on professional treatment urgently. The filtering questions are: "Is this time dependent?" (urgent), "Is it experience dependent?" (need an expert) and "Is it facilities dependent?" (need particular facilities).
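
As a thought experiment (this is my reading of the three questions as decision logic, not SCT's published protocol), the triage boils down to something like this:

    # My interpretation of the filtering questions only; the categories and
    # routing are assumptions based on the talk, not SCT's actual protocol.
    def triage(time_dependent, experience_dependent, facilities_dependent):
        if time_dependent:
            return "urgent: get hands-on professional treatment now"
        if facilities_dependent:
            return "refer to a site with the required facilities"
        if experience_dependent:
            return "remote consultation with a specialist"
        return "treat locally"

    print(triage(time_dependent=False, experience_dependent=True,
                 facilities_dependent=False))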

We saw people sticking their tongues out and waggling their tonsils in kiosks while remote experts tried to figure out what was wrong. Apparently ninety percent of diagnoses can be done by looking at someone, listening to their chest and looking in their ears, nose and down their throat. It's a slightly dehumanising way of doing medicine, in the same way that we all prefer to meet in person rather than through a computer screen or over the phone. The truth is, when you're ill and you're far away from help, anything is better than nothing at all.

Ferguson was not afraid to mention the dangers of turning up at hospital. He'd rather sit in on a telepresence or videoconference consultation than face God-knows-what in person. And patients eliminate the risk of catching hospital-borne infections if they don't have to go near the place.

The benefits are piling up.

The downside, of course, is that this stuff has to be paid for and the bandwidth has to be there. On payment, Cisco has a cash mountain, so this, presumably, is why it's happy to consider spreading payments over time, essentially turning the customer's capital expenditure into operating expenditure. It can still recognise its own revenue at point of sale. Although it's a different issue, we're also seeing gradual acceptance of this pay-as-you-go approach in the various kinds of cloud-based services.

The harder part of the equation is the communications infrastructure. Covering highlands, islands and oil rigs with high quality broadband connections is a political and economic challenge, given the relatively sparse populations. Oil rigs have, apparently, been trialling a satellite-based facility called OPTESS. And some of the ground-based services have been using ISDN but, of course, the higher the bandwidth and the further the reach, the more services can be provided remotely.

Ferguson pointed out that medicine is now so good at patching us up when we get a major illness, we keep on living only to get more and more illnesses, until we end up with some chronic condition. All of this puts increasing demands on an already overstretched health service, much of which, in theory at least, could be alleviated with some kind of home monitoring and self-treatment service, escalating to the professionals as and when needed.

But that's to get ahead of ourselves. Right now, the SCT has run trials inside hospitals, running telehealth 'kiosks' in parallel with conventional assessments in order to compare the quality of results. (It has a clever way of eliminating bias.) It is extending this facility to multiple hospitals and has started home monitoring trials, all of which are testing the principles of telehealth and capturing feedback from users on the experience.

As with so many things in the computer world, the big question is whether it will be able to scale. And that depends largely on either an appropriate infrastructure or a system which can adapt successfully to lower bandwidth connections.

Screen and voice recording/publishing for free

Any company that makes life easy for its customers gets my vote. And one company that tries hard to achieve this is Citrix Online. It is driven by a desire to simplify the previously complex. It also likes to undercut the prices of its major competitors.

Right now it has a free service in beta, called GoView, which lets anyone create a screencast (voice and screen recording). Since the most popular screencast programs are desktop products, its traditional pricing model – a monthly fee – must have presented a bit of a challenge. So its solution was to go for an ad-supported model. At the moment all the advertisements are for the company's other services and they don't in any way interfere with your own screencast creations.

True, it lacks the sophistication of TechSmith's Camtasia or Blueberry Software's FlashBack products, for example, but this is largely the point. It's good enough for the majority of existing and potential screencasters. A few clicks and your movie ends up online and you have a URL to share. If you prefer the extra control a desktop application gives you, you might want to check out TechSmith's Jing – a mini-Camtasia and screen capture program – or FlashBack Express. Both are free, although the licence terms for Express appear to contradict this.

Returning to the GoView service, once the desktop element is downloaded, a couple of clicks start a three-second countdown. Anything you then do on the screen or speak into the microphone gets streamed to the Citrix Online server. When finished, you can edit out the bad bits of the end result, add captions if you want, then share the URL with others. As Aleksandr in comparethemeerkat.com would say, "Seemplz."

GoView is currently in beta and some simple improvements could be made, such as being able to select an area of the screen for recording, rather than the whole screen. But the whole point of a public beta is that the developers get tons of feedback like this quickly and more or less for free. I, and many others, have probably spent hours buggering about with the software and the service. This gives the company a fairly massive free testing resource. The other point of a beta approach is that the service provider is more or less forgiven for flaws. It's how Twitter got so successful. Its 'fail whale' almost became a friend in the early days of the service. I had issues with sound and screen size on Vista at first, but it worked a charm on XP. Once underway, GoView seemed pretty robust.

I think the key to the Citrix approach, and that of many other disrupters, is that it realises that part of the world needs sophisticated software and services, but a much larger chunk actually craves a simpler life and lower costs. Professional screencasters will still want 'proper' products which let them massage and publish the outputs in various ways. But regular end users who just want to grab what they're doing on the screen, twiddle with it a bit, then send it off will be perfectly happy with a simple service which automatically stores the recording online and gives the user a URL which they can share via email, blogs, tweets or whatever. Jing, by the way, comes awfully close in this respect.

GoView is just one of Citrix Online's recent crop of disruptive services. It is taking a pop at the lucrative online education market with a new GoToTraining service. Its fairly new HiDefConferencing offers voice conferencing which can mix up to 500 PSTN and IP participants together. As with its GoToMeeting and other GoTo products, the terms for both services are based on unlimited usage per licence. This is the computing equivalent of one of those 'all you can eat' buffet lunches so beloved of certain ethnic restaurants.

While I don't care much for concentrating on single companies, it has to be said that Citrix Online is a bit of a one-off. It's a successful business which relies on simplicity and an absence of financial surprises for its customers. The first appeals to end users and the second to everyone.

Not a bad recipe at all.

Dan Bricklin (inventor of PC spreadsheet) on technology

A couple of weeks ago, Wiley asked if I'd like a review copy of Dan Bricklin's 'Bricklin on Technology' book. Normally, I'd say "not on your Nelly" because I know what a chore book reviewing can be. However, I was at the West Coast Computer Faire in March 1980 when Bricklin collected his first award for VisiCalc – the pioneering spreadsheet for the PC. I was also a fairly avid user of his 'Demo' program a few years later. Even though I don't think we met (unless it was in Zaragoza a couple of years ago), I felt connected, not least because I also developed and published PC software for many years, but without his degree of visibility or success.
When the book arrived, I winced because it's more or less 500 pages long. Unless you're a commuter or you don't get much sleep, how do you find time to read that much?

Anyway, the book was enjoyable at a couple of levels and a disappointment at another. Enjoyable because it peeled off and examined the layers of thinking that went into various products and issues. Bricklin leaves no stone unturned in his pursuit of insight. The transcript of an 85-minute interview with wiki inventor Ward Cunningham is a classic in this respect. (It was 37 pages.) I'd rather Bricklin had identified and pulled out the key elements but then, I suspect, this would have been an editorial step too far for him. He would have had to impose his own interpretations on the conversation, rather than laying it out in full in front of his audience.

You will get insight if you read this book. Insight into what brought us to where we are and a few glimmers into how we might get to where we're going.

The other enjoyable bit for me, which you won't all share, is that I've met (albeit fleetingly) many of the people mentioned in the book, worked with many of the products and written about many of the issues. Bricklin and I even started programming at the same time – early 1966 – and we've both tried to take the user perspective in our work. The book triggered many long-dormant memories and reawakened many old feelings, especially from the late '70s and early '80s as we all groped our way through the chaos of the emerging microcomputer/PC business. This is not really a reason for buying the book because Bricklin's chosen subjects seem, in the main, to be serendipitous. A comprehensive history book it is not, although it is a useful addition to the history of the IT world of the late 20th century.

The book is a compilation of old blog posts, essays and transcripts of recordings, loosely arranged around topics which Bricklin finds important, all topped and tailed with narrative from the perspective of 2007/8. As he says in the conclusion, "On any topic you can explore deeply and find nuance", which more or less sets the tone for the book. He does dig deep, he records faithfully and, at times, you want him to make his point more quickly. But maybe that's not what he's trying to do. Perhaps he's trying to help the reader understand the nuances, so that they can move forward with their own thinking. I don't know.

Most of his topics have some resonance today, although much of the writing has been overtaken by events or absorbed into the mainstream. The chapters will give you a clue: What Will People Pay For?; The Recording Industry and Copying; Leveraging the Crowd; Cooperation; Blogging and Podcasting; What Tools We Should Be Developing?; Tablet and Gestural Computing; The Long Term; Historical Information about the PC; Interview with the Inventor of the Wiki; and VisiCalc. It's a ramble round the industry and round the inside of Bricklin's head. His invention of VisiCalc gave him a passport to go where he likes when he likes and meet who he likes. And that's what he's done and, in this book, shared it with us.

My approach, if you're thinking of buying it, would be to say, "I'm getting a good 300-page book; I'll just need to pick which 300 of the 500 pages are of most relevance to me." It's a bit like his approach to software – give the user the tools and let them choose how best to use them.

Amazon is selling it in the UK for £10.99.