Wednesday 12 December 2007

Engineers Ireland Innovation Awards

This morning I had the pleasure of giving a brief talk at Engineers Ireland on the occasion of their second annual Innovation Awards, for the most innovative engineer and the most innovative company of 2007. A couple of people asked for a copy of what I said, so for better or worse, here goes…

172 years ago, on the 6th August 1835, the first president of our Institution, Colonel Burgoyne, at our inaugural meeting, said:

“We are engaged in the service of Ireland, and it is our duty, as well as our interest, to promote its prosperity to the fullest”.

In my view, ladies and gentlemen, we – the Engineers of Ireland – could be doing considerably more to promote the prosperity of Ireland to its fullest.

Let me explain. In 2000, our Government had the foresight and commitment to initiate Science Foundation Ireland so as to develop a world class research capability in biotechnology, and information and communications technology, as an essential foundation to our nation’s growth. The discovery of new things, by research, is a commendable activity and may indeed be a foundation for our nation’s growth. It may attract foreign investment, and it may encourage the invention of new appliances, and machines based on insights from the natural world. We celebrate our most successful scientists via SFI, the Royal Irish Academy, the British Telecom Young Scientists Exhibition, and indeed via others.

We also celebrate our entrepreneurs. Ernst and Young, and Ulster Bank, recently hosted a televised evening at CityWest at which our entrepreneurs of 2007 were celebrated. Entrepreneurs organize and manage a business, sometimes taking considerable risk to do so. This year’s winner, Liam Casey of PCH, richly deserved the award. Having known Liam for some time, I was delighted to sincerely and warmly congratulate him in person on the evening for his business sourcing manufacturing services from China. But I did reflect at the time on the extent to which entrepreneurship in general, as so ostentatiously celebrated at CityWest and as reflected in the portfolio of finalists chosen by the judges, really benefits our economy. It is wonderful to see the growth of family businesses, and the implementation within Ireland of business models imported from overseas, but it is unclear to me at least whether these examples necessarily lead to a sustainable prosperity for Ireland. We can have successful hospitality businesses, successful resellers and distributorships in Ireland, and successful implementation here of models already proven elsewhere. While all these entrepreneurial activities create employment, it is unclear in general whether they lead to sustainable national prosperity. “Me-too” businesses here in Ireland may enrich some individuals, but in my view at least will not overcome our faltering national competitiveness.

Science is the discovery of what already exists. Entrepreneurship organizes a business. Invention yields new ideas which did not previously exist. But innovation puts new ideas into practice, bringing to life new insights. Joseph Schumpeter, in his Theory of Economic Development (1912/1934), noted that innovation brings new goods, new methods of production, new markets, new sources of raw materials, and new organizational structures into practice.

I believe if we, in Ireland, can innovate and thus put new ideas into practice, then we will benefit from a sustainable national prosperity. If we can bring new products, new processes, new markets, new sources, and new business structures into practice, then we will not only change Ireland but also change the world.

Today we rightly celebrate our innovators, and our innovative companies. They should be the true foundation for promoting our national prosperity to the fullest.

Saturday 10 November 2007

Open Source, China and Microsoft

Confucius was born in the state of Lu. When he received news that the powerful state of Qi was preparing to attack his homeland, he sent his gifted disciple Zi Gong to talk to the rulers of the surrounding states. Zi Gong went first to the state of Qi and pointed out to the military generals the flaws in their strategy to attack Lu. He succeeded in persuading the generals to attack the state of Wu first instead. Zi Gong subsequently went to Wu and incited the king of Wu to attack Qi… Thus, Confucius saved Lu.

From the ninth chapter of the Chang Duan Jing.


At first sight, it would appear that there should be an excellent cultural fit between the open source movement in the West and Chinese values. Open source emphasizes collective knowledge and the sharing of competence. In China, loyalty to the group is strong; communist philosophy emphasizes sharing, and Confucian teaching emphasizes the latent potential of the individual to attain skilled judgment from the experience of others.

The use of the web is growing fast in China – exceeding that of the US, and growing much faster – as reported in Forbes. Open source collaboration uses the web as a collaboration platform, so you would also expect this to add to the momentum of open source in China.

The Economist Intelligence Unit recently reported a ratio of 100 jobs for every computer science graduate in China, with this number expected to sky-rocket; and Duke University reported 60,000 computer science graduates from 4-year degree programmes in China in 2004, and 292,000 from 3-year programmes. This huge domestic demand for, and huge output of, software developers might lead you to expect further momentum for open source in China.

A recent Eclipse Members meeting noted that China accounts for the largest share of downloads globally (over a recent 18 month period) at 21%, followed by the US at 18%, and Germany and Japan at 8% each.

So: what is the status of open source in China?

Well, there are a small number of open source projects in China, but apparently not as many as you might expect. XOOPS (a content management framework) is quite well known, and Stephen Walli’s blog contains an interesting presentation by Tiaiwen Jiang, the community leader in China, on the project from the Chinese perspective. Huihoo is a leading open source middleware project, including a J2EE implementation, JFox. Qianqian at Harvard Medical School initiated, in 2004, a collaborative project, Wen Quan Yi, to develop an open source font set for the 70,000 Han characters encoded by Unicode.

An interesting development is the merger of ObjectWeb in France and Orientware in China at the end of 2006, to form OW2. They are sharing open source contributions in a variety of middleware technologies and their deliberations are documented in their Board of Directors minutes...

But, perhaps predictably, the main interest in open source in China is Linux: just google ‘china open source’ and you’ll see! In 2003, China enacted the Software Government Procurement Regulation (SGPR), which excluded foreign companies from the federal software market. As recently as the end of 2005, CIO Magazine was discussing China’s federal commitment to Linux – for multiple reasons at the time, including suspicions of US intelligence agency “trojan horses” in US proprietary code (remember the B-767 government jet delivered to Beijing in 2002, on which eavesdropping devices were discovered on delivery?); overcoming WTO IP concerns by promoting open source; localization to the Chinese market; and kick-starting a strong domestic software industry.

But then things changed in 2005, despite CIO Magazine’s analysis above: in trade talks, China laid aside the SGPR in favour of concessions on industries such as textile and colour television.

So now, according to Lu Shouqun of the China OSS Promotion Union (a non-government organisation) in a recent presentation, Linux revenues in all of China last year (2006) were about 218M RMB (€20M), with a market share (by revenue) of just 3%. Other UNIX systems were 52% – the financial services and telecommunications industries in China have heavily used Solaris, AIX and HP/UX, amongst others. Windows was 42%. While the overall market is growing at about 10% per annum, Lu believes Linux in China is growing faster than the market, albeit from a low base.

Things have changed even more significantly since the 2005 abandonment of the SGPR directive: Microsoft seems to have successfully found favour with the federal authorities. Fortune magazine documented Bill Gates’s recent summer visit to China, the history of Microsoft in China, and how Microsoft has very successfully wooed Chinese policy makers, creating an apparent “win-win” situation. It is a fascinating article, and I recommend it to you if you haven’t already read it.

So, why are there so few committers in China, and apparently such meek participation in global open source, particularly when the number of software professionals in China is rising so rapidly? Why is Windows far more successful in China than Linux, and is Microsoft’s new strategy truly a “win-win”?

The urgency to make money is IMHO a national obsession in China. Consumerism, and the chasing of Western fashions and brands, are all-consuming. You must remember that, within the lifetimes of those of us in our 40s or more, very many Chinese were impoverished to appallingly abysmal standards of life. It is really only in the last twenty years or so that national living standards have consistently and dramatically improved – while admitting of course that many challenges and disparities remain today across the huge country. In my own experience, making money is far more important to many Chinese than political discourse.

If your parents and grandparents have supported you, a single child, through your professional education as a software developer, their expectation (and need in their old age) will be that you will support them. Your partner’s parents and grandparents will have similar expectations. An engineer, including a software engineer, is considered a respected professional: many policy makers, senior business managers, and senior party members also have engineering backgrounds. As a software engineer, your own expectations, your family’s, and society’s are all that you will be financially successful.

Can you be truly financially successful in China if you are an open source developer?

In the West, much of the open source activity in fact is carried out within companies, including Intel, Novell, IBM, Sun and Oracle, amongst others. Foreign companies in China with recognized global brands – such as Microsoft, with its very wealthy founder – are highly attractive as employers, since they not only in general pay well by local standards but also potentially open the possibility of international travel.

But which foreign companies have so far established software development laboratories in China which contribute to open source development? IBM has a Linux technology centre in Beijing. Intel announced in 2004 development centres in Beijing, Xi’an, and Guangdong to help Chinese companies develop desktop applications for Linux. Oracle promotes its products on Linux in China, including via the Oracle technology centre in Beijing. But these centres appear, on the surface, to be solution centres which promote Linux-based application solutions – perhaps as a response to the apparent promotion of Linux by the Chinese authorities prior to the SGPR retraction – rather than development centres actively contributing to globally available open source. I am of course very open to correction, but it would appear that to date only Novell has opened an R&D centre in China specifically for Linux system development. It is also noteworthy that IONA (of which I’m Vice-Chairman) has Chinese committers from its Beijing R&D centre on the Eclipse STP and Apache CXF open source projects.

For domestic Chinese activity, the largest player in open source Linux is Red Flag. It has ambitions to develop into an international player in Asian Linux, and has recently announced a specific initiative. However its current sales revenue is still relatively small, even by Chinese national standards, at just 40M RMB (€3.6M).

To the extent that open source development is being conducted in China by foreign companies, with Chinese committers, these initiatives arguably need to be more openly promoted and publicized to the Chinese software development community. The open source movement in China needs major foreign brand name companies to visibly invest and recruit in China for the development of open source.

Well then, how about Chinese open source start-ups? Are there any budding MySQL ABs, SugarCRMs, or xTuples? Yes – I mentioned Huihoo and Red Flag earlier as examples. But IMHO the open source industry in China is currently fragile, and is considered so both by potential employees (i.e. software developers) and, as importantly, by Chinese corporate customers. It is frankly easier – and perhaps more socially acceptable with one’s parents – to work for an established organization, particularly if it is a foreign brand.


Let’s now look at that second question I posed above: why is Microsoft now being so much more successful in China? Microsoft seems to have overcome the concerns of Chinese policy makers by pro-actively taking a number of steps. The “trojan horse” threat has been overcome by allowing access (and hence inspection) to Microsoft source code under appropriate conditions, and China now has a federal laboratory to do exactly that. Microsoft has been very actively investing in the education sector, including rural classrooms and software engineering universities, thereby aligning its investments with the federal desire to strengthen software skills nationwide – Microsoft is training 1,000 instructors and 20,000 software engineers, and offering online courses to another 50,000 engineers. It has worked with the federal authorities in the context of WTO obligations to ensure that more Chinese PCs have legally licensed pre-installed copies of Windows. In turn this is overcoming piracy issues, since the pre-installed versions are more current, and have fewer bugs and more features than older copies of Windows. It has also dramatically dropped the price of Windows in China. In summary, President Hu Jintao, on a visit to Microsoft, said that Bill Gates is a friend to China and the Chinese people: in China, this is an incredible endorsement, and it is difficult to find any analogy in the West for such an important and powerful public ratification.

It would appear that Microsoft is on a roll in China. It is very interesting to reflect on the Microsoft strategy, and its execution within China, and to contrast that with the open source industry and initiatives in China. Which open source project or company will the President of China publicly endorse as a friend of China and of the Chinese people? If the open source industry globally is to benefit from China’s rapid development, it is clear that investment – perhaps akin to Microsoft’s commitment to China – will be needed.

It is also very interesting to ponder to what extent the prices Windows customers in the West are paying are being used to subsidise Windows customers in the East. Cross-subsidies are of course not at all a new idea in any industry: I find the Microsoft case interesting because it is part of a much broader initiative.


Let me finish with a hypothesis about policy from the Chinese Government perspective. The war is not about whether Windows or Linux will ultimately win. Both are sufficiently low cost in China to be usable. Microsoft is generously up-skilling the national software engineering talent pool, and the open source industry is also helping by publishing its source code and inner workings. The war is rather about building a vibrant software industry in China, capable not only of satisfying national needs, but also of exporting and becoming a world leader.

The main requirement is excellent application development, on whatever foundation systems and middleware technology are de facto in the global industry (it doesn’t matter which, as long as they are low cost in China). Open source by the Chinese industry – and for the Chinese industry – could play a very significant role indeed. By fostering a national repository of re-usable Chinese application components – with documentation and test suites – written by Chinese developers (in Chinese first, and then maybe English), within a framework put in place and reinforced by Government policy and investment, the national industry could be rapidly enhanced.


“Therefore the Master concerns himself with the depths and not the surface, with the fruit and not the flower… When his work is done, the people say ‘Amazing: we did it, all by ourselves!’”

The Tao Te Ching, by Lao Tzu


This was the basis for an invited keynote I was to give at the OS Summit in Hong Kong at the end of this month: but the conference has now been postponed until sometime in 2008.

Monday 5 November 2007

Think Liquidity.

Professional investors understand liquidity. They understand asset backed securities, and they understand the risks when assets subsequently emerge to be poorer quality than they were represented to be. Sub-prime assets can be embarrassingly illiquid and career changing.

In the world of enterprise IT investments, customers have likewise yearned for liquidity. Lock-in to assets available solely from a single vendor implies significant risk to the purchaser. Bad investments in IT assets which subsequently emerge to be poorer quality than they were represented to be can be embarrassing and career changing – particularly if the assets are illiquid and difficult to replace.

Financial markets have been driven by liquidity. Purchasers of IT assets, by contrast, have found it challenging to subsequently replace assets and substitute alternatives when desirable.

Until now.

Today, I believe that the software and hardware industries are fundamentally changing in favour of liquidity. Software is increasingly componentized. Software increasingly follows recognized industry standards, which facilitates substitution of alternatives. Open source provides liquidity by lowering the cost of change.

Enterprise software vendors should be trusted partners in providing and maintaining tailored solutions. The ability to successfully integrate a variety of components from a variety of sources to an enterprise level of service is valued. The ability to scalably manage multiple configurations, and evolve them dynamically over time, is valued. Tailorable, personalized solutions for specific customers, partners and staff, but all as part of the holistic enterprise, are valued. Dynamic systems enable liquidity amongst software assets – no matter from which particular vendors specific assets are obtained.

New, and sustainable, business models are emerging from software vendors who deeply understand technology liquidity.

Single, monolithic, vertically integrated silos of software stacks are thinking from the last century. Integrated stacks are illiquid if any specific components or layers cannot readily be substituted by better alternatives – on the market today, or emerging tomorrow – from any vendor.

It does not take an oracle to foresee what will happen if BEAS is purchased by ORCL. Overlapping products – portal servers, application servers, service buses, Java development platforms, whatever – will be culled: “synergies” throughout the two organizations will be executed. ORCL will attempt to cross-sell its own offerings into the BEAS client base and migrate those customers away to ORCL alternatives.

Professional investors understand liquidity. Even if they are new to investing in IT equities, they should therefore have little difficulty in understanding that enterprise IT customers likewise yearn for liquidity of technology assets. In the past, enterprise vendors have been slow to offer liquidity. Now, the IT industry is changing fast, and the potential upside for investors in IT lies with vendors who understand, and are executing on, technology liquidity: vertically integrated illiquid stacks are from a former and sub-prime era.

Think liquidity.

Tuesday 30 October 2007

The Long Tail Wags

What kicked off this blog entry for me were some reflections I had while chairing a one-day conference, with a preceding half-day workshop, a couple of weeks ago, organized in Galway by Fidelity Investments, on the subject of Web 2.0 and how commercial organizations – such as Fidelity – can benefit from and add value to the global internet community.

A core theme of Web 2.0 is the collective wisdom that results from the network effects of sharing across the community. The wisdom of the crowd is sometimes more than the sum of the individuals therein. Wikipedia is one prime example: collective knowledge can trigger insights for individuals, which can then augment the collective wisdom – added knowledge which would not arise if the group did not collaborate. I recently wrote a blog entry about a similar phenomenon in multi-disciplinary research. Group and individual reasoning can positively feed off each other.

One of the dichotomies of the Web 2.0 phenomenon is, on one hand, the power of the group and the collective knowledge of the crowd; and yet on the other, the significance of each individual. Yes, collective knowledge repositories such as Wikipedia and del.icio.us (and of course Google!) have emerged, but blogs written and commented on by individuals are still a fundamental force. Some may argue that the information in specific blogs written by certain individuals is indeed more valuable, accurate and reliable than that in shared wiki repositories maintained by the amorphous net community. There is room for both individuals and the crowd.

In the race to be successful on the web, the focus is on attracting and retaining users, eyeballs and clicks. For the promoters of a web site – whether a commercial venture, or just a worthy cause for society at large – building and growing a “market” is key. Monitoring usage patterns, and listening and reacting to user feedback, all build momentum: hopefully a tipping point is passed, and the network effect of market momentum reinforces popularity and acceptance. However: does a mass market strategy play only to the crowd, and not to the individuals therein?

For Web 2.0, Chris Anderson introduced the term “long tail” to emphasise deploying:

“customer-self service and algorithmic data management to reach out to the entire web, to the edges and not just the center, to the long tail and not just the head”.

As noted by an Amazon employee quoted in Wikipedia: “We sold more books today that didn’t sell at all yesterday, than we sold today of all the books that did sell yesterday” – read that slowly to yourself again if you haven’t come across it before.

Further, in his discussion on technology market dynamics, Christensen noted:

“Simply put, when the best firms succeeded, they did so because they listened responsively to their customers and invested aggressively in the technology, products, and manufacturing capabilities that satisfied their customers’ next-generation needs. But, paradoxically, when the best firms subsequently failed, it was for the same reasons – they listened responsively to their customers and invested aggressively in the technology, products, and manufacturing capabilities that satisfied their customers’ next-generation needs… But the problem established firms seem unable to confront successfully is that of downward vision and mobility, in terms of the trajectory map.”

Success – at least in some industries, such as the disk drive and earth moving machinery industries which Christensen documents – can equally lay the foundation for failure. Group and individual behaviour can play off each other, creating emerging forces from below.

Combining Anderson’s exhortation to reach out to the edge with Christensen’s warning not to be outflanked by emerging disruptive technologies from below, it would seem wise to ensure that a Web 2.0 market strategy explicitly recognises the potential of the individual, as well as the mass market of the group. Even more important is to react to dynamics in the market and to changing tastes. The observation is that, in a global market of mass consumerism, the ability to cater for the vast number of changing individual personal tastes and desires – to personalise and dynamically tailor your products and services – may be much more valuable, commercially and socially, than volume plays of standard products and services to an anonymous, amorphous group.

The long tail changes: it wags. The universe of micro-markets in the long tail is worth addressing – this is Anderson’s observation. In addition, changes in a micro-market may in due course influence the mass market (and blind side you) – this is in essence Christensen’s observation.

Ajit Jaokar was one of the contributors at the Fidelity event in Galway, and he reminded the audience of Tim O’Reilly’s characterization of the Web 2.0 phenomenon:

  • the web as a platform;
  • harnessing collective intelligence;
  • data as the next Intel inside;
  • the end of the software release cycle;
  • software above the level of a single device; and
  • rich user experiences.

I thought about each of these six facets as I reflected on the dynamics of the micro-markets, and list some (not all!) of my observations below: how does Web 2.0 address the wagging long tail?

One common way of reaching out to the long tail, and monitoring its movements, is “data as the next Intel inside”. Google exploits what it itself calls the “uniquely democratic nature of the web”, deriving data from the linking and click activity of millions of users to drive its PageRank algorithm. I suspect that social networking sites, such as MySpace, Facebook, Bebo and LinkedIn, could likewise exploit their data about networks of people. I’ve always wondered in passing how data regulators, such as the Irish Data Protection Commissioner, view such click gathering and social networking activities…
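
To make that link-analysis idea concrete, here is a minimal sketch of PageRank-style power iteration: each page’s score is fed by the scores of the pages linking to it, damped and iterated until it settles. The three-page link graph is invented for the example, and this is a toy illustration of the published algorithm, not Google’s actual implementation.

```java
import java.util.*;

// Toy PageRank by power iteration: a page's rank is fed by the ranks of
// the pages linking to it. Illustrative only -- not Google's implementation.
public class ToyPageRank {
    public static void main(String[] args) {
        // Hypothetical link graph, invented for the example: page -> outlinks.
        Map<String, List<String>> links = Map.of(
            "A", List.of("B", "C"),
            "B", List.of("C"),
            "C", List.of("A"));

        final double d = 0.85;                  // damping factor
        final int n = links.size();

        Map<String, Double> rank = new HashMap<>();
        for (String p : links.keySet()) rank.put(p, 1.0 / n);

        for (int iter = 0; iter < 50; iter++) { // iterate to (near) convergence
            Map<String, Double> next = new HashMap<>();
            for (String p : links.keySet()) next.put(p, (1 - d) / n);
            for (Map.Entry<String, List<String>> e : links.entrySet()) {
                double share = rank.get(e.getKey()) / e.getValue().size();
                for (String target : e.getValue())
                    next.merge(target, d * share, Double::sum);
            }
            rank = next;
        }
        for (Map.Entry<String, Double> e : rank.entrySet())
            System.out.printf("%s: %.3f%n", e.getKey(), e.getValue());
    }
}
```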

As Web 2.0 reaches beyond “the level of a single device” – which was the theme of Ajit’s presentation – I observe that telecommunications operators, such as Vodafone, O2 and T-Mobile, collect and record large quantities of data, ironically in large part due to the regulatory requirements of data retention legislation in jurisdictions such as Ireland. These vast data archives capture the social communication patterns of entire populations – it is not just the social networking sites which have such data! There are opportunities, and privacy risks, in exploiting these vast patterns to provide commercial value. One can imagine intelligence being generated by mobile phone operators to enable targeted marketing by third parties and content providers…

Quietly collecting mouse click activity, or analyzing phone call patterns and triangulating the locations of their owners, are data-driven ways of reaching for the long tail. Importantly, the data is dynamic, and therefore so also is the derived collective intelligence: as trends emerge, or change, so can the offerings tailored for particular sets of individuals. But it seems to me that these approaches may wear a clandestine cloak: there are more explicit ways of using collective intelligence for the wagging long tail.

A successful web site needs stickiness to retain, and track, its audience, including the wagging long tail. A social networking site such as those above generates its stickiness in part from the network effect of having one’s friends, co-workers and colleagues using the same site: most people think it too cumbersome to re-register and to encourage their cohort to move elsewhere. IMHO, the resulting stickiness is in fact rather superficial, rather than systemic: it consequently threatens any purported fiscal value for the site. The SIOC project at DERI in Galway underlines the benefits, and business consequences, of seamless federation of online communities. Stickiness should not be driven by re-registration inertia, but rather by community value and impact – which is one reason why I believe Ammado will succeed.

Another strategy to capture the long tail’s wags is “harnessing collective intelligence” – another of O’Reilly’s Web 2.0 facets. One explicit way is to allow individuals to define their own collections of interesting items, and then to share and discuss them with other like-minded souls. A simple way to do this is just by tagging items: for example, photos tagged with “Galway” on Flickr will appeal to certain people. But tagging rather quickly loses its impact when trying to find items with a combination of tags, requiring tedious trials of various search criteria; and tag synonyms are a problem in a global world: try “football” in del.icio.us as an example – or even “Galway Hooker” in Flickr!
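
As a concrete illustration of why combined-tag search gets awkward, here is a minimal sketch of tag intersection over a handful of items; the items and tags are invented for the example, and real tagging services index all of this far more cleverly.

```java
import java.util.*;

// Toy multi-tag search: find items carrying ALL requested tags.
// Invented data; real tagging services index this far more cleverly.
public class TagSearch {
    public static void main(String[] args) {
        Map<String, Set<String>> itemTags = Map.of(
            "photo1", Set.of("galway", "boat", "hooker"),   // the sailing boat
            "photo2", Set.of("galway", "pub"),
            "photo3", Set.of("football", "soccer"));        // synonym problem!

        Set<String> wanted = Set.of("galway", "hooker");
        itemTags.forEach((item, tags) -> {
            if (tags.containsAll(wanted))
                System.out.println(item + " matches " + wanted);
        });
        // A search for "soccer" misses items tagged only "football", and
        // vice versa: tags carry no synonym or sense information.
    }
}
```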

Rather than expecting people to work out the correct search and query expression across a set of tag values for what they actually want, another way is to let people explicitly define and share their own collections. Hobbyists – such as stamp collectors – have been doing this for years, and a craft worker likewise learns the right set of tools for the job from the experience of his trade. Enabling individuals – and micro-markets, and even the crowd – to share what works well together, and when, is a positive tactic for managing the wagging long tail, and one which I wrote about recently in the context of software configurations and Cloudsmith.

While doing a little research for this blog entry, I came across Dan Bricklin’s interesting blog entry on When the Long Tail Wags The Dog. His theme is that “must have” items have more value than those which are less likely to fit the job at hand, and that general purpose items are likely to have the most value, since they can entertain the dog as well as its long tail. While I don’t disagree, my thoughts about the dynamics of the long tail, and how to work within it, have a slightly different emphasis.

Reinforcing Bricklin, being relevant to the individuals and small groups in the long tail requires customization and tailoring of general purpose offerings. In the enterprise software space, software vendors have traditionally worked with systems integrators to tailor more appropriate solutions for different niches of the market. With componentization in the software industry, it is in principle more likely that useful aggregations of different constituent parts can be used to tailor specific solutions. Collective intelligence can be harnessed by explicitly sharing these aggregations, as I indicated above.

However, once delivered, installed and put into use, assemblies of software components in production environments have in the past usually been relatively static. The challenge is that the long tail wags: micro-markets and even mass markets change, and production systems need to be able to change more easily too. One promise of the dynamic module environment of OSGi is to enable dynamic evolution: if collective intelligence can be dynamic as trends emerge or change, and can then suggest new offerings tailored for particular sets of individuals, then perhaps production software assemblies in business and enterprise environments can also be dynamically adjusted. This is a theme we are working on in IONA, particularly in the context of our highly dynamic plug-in architecture for Artix and our open source FUSE offerings – see Eric Newcomer’s blog.
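
For readers unfamiliar with OSGi, here is a minimal sketch of the dynamic-module idea: a bundle registers a service at runtime, and can later withdraw it, letting consumers rebind to a replacement without restarting the JVM. The Greeter interface is invented for the example, and a real deployment also needs a bundle manifest and an OSGi framework such as Equinox or Felix.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Minimal OSGi sketch: a bundle that registers a service on start and
// withdraws it on stop. The framework can swap implementations at
// runtime -- the dynamic evolution discussed above. "Greeter" is a
// hypothetical interface invented for this example.
public class GreeterActivator implements BundleActivator {
    private ServiceRegistration<Greeter> registration;

    @Override
    public void start(BundleContext context) {
        // Publish an implementation; consumers discover it by interface.
        registration = context.registerService(
            Greeter.class, name -> "Hello, " + name, null);
    }

    @Override
    public void stop(BundleContext context) {
        // Withdraw the service; consumers are notified and can rebind
        // to an alternative implementation without a JVM restart.
        registration.unregister();
    }
}

interface Greeter {
    String greet(String name);
}
```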

But if the world is dynamic, and contains a universe of micro-markets as well as a mass market, can offerings not only be tailored so as to be a good fit, but also be priced attractively for each? And what happens when micro-markets change? As I have written before, one industry which faces these challenges is the mobile/cell phone operators, for whom intelligent and responsive bundling of services (SMS and MMS and call rates and roaming charges etc) is competitively critical. The ability to generalise this approach into the world of software components and packages is of interest and relevance to the wagging long tail. LeCayla is one company building a common approach to the issue of dynamically rolling out new charge and billing rates, so that software vendors can competitively foster specific micro-markets.


Let me summarise: the internet as a platform enables a global market to be addressed. Web 2.0 uses collective intelligence to play in both the mass markets and the long tail of numerous micro-markets. There is room for both individuals and the crowd, and group and individual reasoning can positively feed off each other – creating the ever-lurking possibility of being disrupted from below. Scalably addressing both the mass market and the long tail is however not the complete issue: markets change, the long tail wags, and scalably addressing dynamic markets is even harder – but possible.


Footnote: my 7 month old Alsatian, Charlie, has yet to grow into his long long tail. He wags it a lot.

Friday 26 October 2007

News: Cloudsmith, Science Gallery, UNICEF and Ammado.

Some news snippets from some of the things I'm involved with:

Cloudsmith

An upgrade to the web site at www.cloudsmith.com – more features (particularly so that you can easily map your own repository) and better documentation. The site is generating considerable interest both in the open source community and among commercial firms seeking interesting new ways to distribute and support their software.

Note that the next major release of the Eclipse development environment – the Ganymede release – will be built using Buckminster: the team behind Buckminster has also built Cloudsmith, as I discussed in some earlier posts.



Science Gallery

Opening next February, with the LIGHTWAVE festival and showcase, and getting some good coverage, e.g. in Nature and the Irish Times recently. Watch the blog at www.sciencegallery.org for further news and announcements.



UNICEF

A great new video at http://www.youtube.com/user/UNICEFIreland – featuring Irish children with Irish celebrities Andrea Corr, Colin Farrell, Marian Finucane, Stephen Rea and Ryan Tubridy. Our UNICEF colleagues in the US Fund produced a similar (US-centric) video last year, and the Irish National Committee has borrowed their idea, with permission.

“Every single child deserves to live”.



Ammado
Their web site is now live at www.ammado.com. I first mentioned Ammado and their CEO Anna Kupka earlier this year. Actually, I'm not directly involved in Ammado, but I know the senior team and am very excited by what they're doing.

An entirely new way of thinking about social networking is imminent...


"Creating heros.."


Monday 10 September 2007

Sharing the best and most valuable: Software Hot Rodding

Like some other teenagers of my generation in the seventies, one of my hobbies at the time was faithfully constructing scale models from plastic kits from Airfix, Revell, Historex etc. As I developed my skills and interest, I took great pride in adding extra levels of detail – particularly for ship kits. Eventually I converted particular kits so as to model an aircraft, ship or vehicle not directly available as a standard kit, by moulding balsa, crafting acetate sheet and so on – for example building a twin engine, triple tail fin Avro Manchester from the Airfix kit of the successor four engine Lancaster. The monthly Airfix Magazine was a great source of designs and examples, and I still have a shelf full of back copies here in my office at home.

When I was an engineering student, I took a similar interest in building my own, albeit fairly simple, analogue electronic circuits – the usual things such as oscillators, radios, fire/smoke detectors, etc. Of course, as with converting plastic kits, having a good stock of spare parts - scavenged from broken electronic circuits – was useful. I used to keep them sorted into empty margarine tubs on my shelves.

In more recent years, I became more and more involved in computers, from the digital gates and circuits up to microcode, assembler, compiler/interpreter systems and finally full systems and applications. I was fascinated when they became affordable as home systems, and played around adding extension cards and interfaces to my first PC. I was initially really impressed by Dell when I saw their range of alternative pre-configured systems for direct order across the internet. Dell used to have a plant in Bray just a few miles away from my home, and I have bought many systems over the years from them for home, school and charitable use.

But I’ve often thought there must be quite a few people out there like me who aren’t quite happy with a specific system they buy from Dell (or indeed any other supplier). Although I carefully choose a configuration before buying, I frequently tailor my purchase further afterwards with additional cards, expansions and options. I also wonder whether somebody out there has already built the perfect system for creative digital photography, or multi-screen home media management, or whatever. Or whether anyone would be interested if I could publish my own designs and configurations. Or whether any configurations would be commercially interesting, for example for specialist systems such as creative digital media design.

Reflecting on my fads above, the common aspects are: a desire to build interesting variations of basic designs, from additional parts either built myself, stock-piled somewhere (not necessarily in margarine tubs!), or purchased elsewhere; to look at designs published by others; and perhaps even to publish my own.

Well, that’s what Cloudsmith is doing for the world of software. One way to think of Cloudsmith is as a derivation of Dell’s world, for software. You can browse an online catalogue at Cloudsmith and see what “distros” – configurations – of software components are available. If you want to choose any particular one, you click it and it “materializes” – i.e. downloads its parts, and then automatically assembles them together – onto your machine. And like getting a system from Dell, the different components of your configuration will usually come from different “repos” (repositories – think Dell sub-suppliers) around the world. So rather than just a single download of a binary file, a materialization will usually automatically fetch multiple files, from multiple places, and assemble them automatically for you on your machine as a complete system. The distro is thus “virtual”: it is not monolithic (like an old download) and its parts are not hosted at Cloudsmith (any internet repo can contribute).

If you like what you find, you can send others the “Cloudlink” you used. When they click it, the same components will materialize from the cloud of components available in multiple repos across the internet, and assemble and install on their machine too. So, you can materialize something specific without having to connect to the Cloudsmith site yourself and without having to search the online catalogue for it – just get the Cloudlink from someone else via an email or blog or whatever, and the magic will happen when you click.

What’s kinda nice about the Redhats of this world is that they pre-assemble a large collection of software components for you as a full Linux system. What’s nice about Cloudsmith is that if you build your own interesting set of software components, you can publish that configuration at Cloudsmith, and reliably make it available for other people to use. It need not of course be as complex as a full operating system: just a nice application or tool or subsystem for a particular use which you have put together, using your own design or as an interesting derivation from somebody else's published distro. You can publish novel and interesting things that you’ve done, for others to take a look at and possibly use – no longer are you or they tied just to the pre-assembled varieties which the software equivalents of Dell provide. And if you have components from a new repo which Cloudsmith does not yet know about, you yourself can simply add that repo to Cloudsmith’s map of the world – the Cloudsmith software equivalent of introducing Dell to a new sub-supplier.

Why hasn’t a service like Cloudsmith existed up to now? One reason is that although there has been a proliferation of useful re-usable software components developed and published around the world in many online repos, there has equally been a proliferation of software version control, make and build systems. It is difficult to justify re-engineering everything globally to use the same version control and build technology. IMHO it is impossible to impose one build or version control technology, or one IDE, on everyone: and it is short-sighted to expect everyone else to use the one that you happen to think is best. However, with some thought and help, it is possible for systems like Cloudsmith to automatically interpret, on the fly, the meta-information in all these version control and build system technologies, and thus automatically derive a global perspective and capability across many different repos.
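
To illustrate the kind of adapter layer this implies – and this is purely my own hypothetical sketch, not Cloudsmith’s actual design – one can imagine one reader per repository format, each normalizing its native metadata into a common component description; Component, RepoReader and Resolver are names invented for the example.

```java
import java.util.List;

// Hypothetical sketch of a per-format adapter layer: each reader parses
// its repository's native metadata into one normalized description.
// Not Cloudsmith's actual design -- an illustration of the idea only.
record Component(String name, String version, List<String> dependencies) {}

interface RepoReader {
    boolean canRead(String repoUrl);         // e.g. recognizes a Maven layout
    List<Component> read(String repoUrl);    // normalize native metadata
}

class Resolver {
    private final List<RepoReader> readers;

    Resolver(List<RepoReader> readers) { this.readers = readers; }

    // Build a single global view across many differently-formatted repos.
    List<Component> catalogue(List<String> repoUrls) {
        return repoUrls.stream()
            .flatMap(url -> readers.stream()
                .filter(r -> r.canRead(url))
                .findFirst()                  // first reader that understands it
                .map(r -> r.read(url).stream())
                .orElseGet(java.util.stream.Stream::empty))
            .toList();
    }
}
```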

But how do you know that somebody else’s published configuration is any good, and trustworthy? Well, how does a plastic modeler or an electronics hobbyist know that somebody else’s design is any good? Part of the answer is the popularity of the design – how many others are using it, and what do they say about it – and part of it is the reputation of the designer. Social community sites on the internet in general work because most people are ethical, and many people can watch, observe, comment on, and where necessary mend, information which is incorrect.

Assembling and publishing new configurations of components is fun. Is it, however, just for hobbyists and software hot-rodding? I believe that many commercial organizations will find interest in discovering, and contributing to, software configurations and assemblies that add real value in their business domains. I think enterprise managers will find reassurance not only in the popularity of certain distros, but also in being assured that precisely the same distro (and bill of materials) is reliably installed on every machine under their control, and that component updates can be notified and controlled. I think vendor product managers and technology strategists will find interest in the concept of being able to build their own private “cloudspaces” for their licensed customer communities, to manage and distribute software in a controllable way.

Now, let me put my hand up and confess that all of this isn’t yet as smooth as we would like at Cloudsmith right at the moment: it is an alpha version for the community to experiment with and give us feedback on. The materialization wizard and the wizard to build a Cloudlink are IMHO both pretty good and straightforward to use. Private and public cloudspaces are ready to use. But today, the publishing wizard and the wizard to register a new repo are IMHO a bit clunky: we’re improving them right now and expect better versions within a couple of weeks. Our online documentation is being improved. However, if you’re interested, contact us and we’ll talk you through it online and get you involved in the Cloudsmith community.

Saturday 8 September 2007

Academic and industrial research: multi-disciplines are best..

In the 70s I was a postgraduate student at TCD Computer Science, interested in digital satellite networking and local area networking technologies. I spent some time studying the design of the Cambridge Ring and other token ring networks. I was then completely inspired by the emergence of the CSMA/CD Ethernet algorithm (in turn derived from the earlier ALOHAnet radio broadcast network), whereby instead of carefully synchronized access to the shared medium, senders are optimistically allowed to transmit whenever they wish: if interference occurs on the medium because of simultaneous transmission, then a randomized backoff procedure allows the system to recover.
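
That randomized recovery is the binary exponential backoff at the heart of classic Ethernet: after the n-th successive collision, a sender waits a random number of slot times drawn from [0, 2^n − 1]. Here is a minimal sketch of the idea; it is a toy illustration, not a driver implementation, though the cap at 10 doublings (and the abandonment of a frame after 16 attempts) matches the classic standard.

```java
import java.util.concurrent.ThreadLocalRandom;

// Toy sketch of CSMA/CD-style binary exponential backoff: after the
// n-th collision, wait a random number of slot times in [0, 2^n - 1].
// Illustrative only; classic Ethernet caps n at 10 and gives up on a
// frame after 16 attempts.
public class Backoff {
    static long backoffSlots(int collisions) {
        int exp = Math.min(collisions, 10);            // cap the window
        return ThreadLocalRandom.current().nextLong(1L << exp);
    }

    public static void main(String[] args) {
        for (int c = 1; c <= 5; c++)
            System.out.printf("after collision %d: wait %d slots%n",
                              c, backoffSlots(c));
    }
}
```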

I was reminded of all this when I was chatting last week to Lawrence Cowsar, head of Bell Labs Ireland and CTO of the CTVR initiative (I'm chair at CTVR). Sometimes interesting systems result from relaxing constraints and encouraging established taboos to be challenged: for Ethernet, the taboo of carefully synchronized access was replaced by unconstrained transmission initiation.

Lawrence, over dinner, was bemoaning the fact that academic research funding is usually and inevitably given for highly focused, highly specialist, uni-disciplinary research. As a result, although the majority of postgraduate researchers, PhDs and post-doctorates may have some experience in team work, that team work is only within the constraints of their own particular research area, under the guidance in most cases of an appropriately myopic faculty member.

By contrast, in industry generally – and in Bell Labs as one example – researchers pro-actively participate in inter-disciplinary teams. This seems to be in complete contrast to most academic research. One reason seems to be that it is difficult to find funding agencies willing to invest in real inter-disciplinary projects, in which multiple research laboratories and multiple terminologies and backgrounds are involved – proposals for such research are generally peer reviewed by research specialists, who in turn emphasize uni-disciplinary and singular focus as the best way that their own particular state of the art can be advanced.

I was also reminded of this inertia towards “interference” between disciplines when chatting to Lawrence about the technology and algorithms of co-channel management for mobile phone (cellular phone, if you are US) networks. In today’s mobile networks, transmitter masts are carefully erected across a city or landscape so as to minimize interference between masts which reuse the same radio frequencies. This co-channel management challenge is usually resolved by map-colouring algorithms, which allocate frequencies to masts during network design so as to minimize interference. Sometimes, I smiled to myself, do research proposers seeking financial support from various funding agencies use a map-colouring algorithm to minimize the overlaps between their work and so optimize their chances of funding?
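
For a flavour of the classic approach: treat masts as vertices, join two masts with an edge if they are close enough to interfere, and let “colours” be frequencies. Here is a minimal greedy colouring sketch; the four-mast interference graph is invented for the example, and production planners use far more sophisticated optimization than this.

```java
import java.util.*;

// Toy greedy graph colouring for frequency assignment: masts that can
// interfere (edges) must not share a frequency (colour). Invented
// example data; real planners use far more sophisticated optimization.
public class ChannelColouring {
    public static void main(String[] args) {
        // Hypothetical interference graph: mast -> masts near enough to clash.
        Map<String, List<String>> interferes = Map.of(
            "mast1", List.of("mast2", "mast3"),
            "mast2", List.of("mast1", "mast3"),
            "mast3", List.of("mast1", "mast2", "mast4"),
            "mast4", List.of("mast3"));

        Map<String, Integer> freq = new LinkedHashMap<>();
        for (String mast : interferes.keySet()) {
            Set<Integer> used = new HashSet<>();
            for (String nb : interferes.get(mast)) {
                Integer f = freq.get(nb);
                if (f != null) used.add(f);     // frequency taken by a neighbour
            }
            int f = 0;
            while (used.contains(f)) f++;       // lowest frequency not used nearby
            freq.put(mast, f);
        }
        freq.forEach((m, f) -> System.out.println(m + " -> frequency " + f));
    }
}
```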

CTVR, and the other CSETs funded by SFI, really are inter-disciplinary: multiple laboratories, academic and industrial, with a very large set of skills and backgrounds. Terminology and a common ability to communicate and collaborate were definitely a problem in the early stage of CTVR. However, we are now seeing highly interesting results which IMHO could only have emerged specifically because of the inter-disciplinary nature of CTVR.

One example is a collaboration between the thermal (heat) management research team in the University of Limerick, the radio frequency (RF) and dynamic spectrum research team in NUI Maynooth, the software radio team in TCD, and finally the constraint and optimization analysts at UCC (a full list of CTVR partners is here). It turns out that the taboo of pair-wise interference minimization in co-channel algorithms for mobile (cellular) phone networks can be broken. In fact, channels need not be carefully separated, and interference can be allowed. Using insights from “temperature interference” in thermal management systems, instead of pair-wise constraints between transmitters, a single constraint can be applied across all transmitters whose signals can still be reliably discerned by a receiver. Channels can then be allocated in mobile networks in such a way as to bring the commercial benefit of reducing the number of transmission masts and equipment required.

Who would have thought that research into heat management would lead to an interesting new way of allocating channel frequencies in a mobile phone network?

Another example of multi-disciplinary work is the application of well understood results in the radio frequency and dynamic spectrum field to the relatively younger area of photonics and the management of light, for next generation 100 Gigabit Ethernet – work being led by Tyndall at UCC, but to which the NUI Maynooth RF engineers have been able to bring remarkable insight.

In my view – and I believe I speak for Lawrence too – some of the ablest and best researchers whom industry can hire are those postgraduates and post-doctorates who have truly experienced, and innovated in, inter-disciplinary research. Taboos can sometimes best be challenged and overcome by researchers from outside the immediate specialist domain in question. And yet, world-wide, much academic research seems structurally resistant to such ways of working.

Monday 20 August 2007

Cloudsmith goes live!

As some of you may know, I regularly holiday just outside Roundstone in Connemara. I’ve just come back to Dublin yesterday after some time there again.

If you haven’t yet been to the west of Ireland, I think one of the most striking things is the web of small stone walls that embrace the fields, pastures, meadows, boglands and tracts. Each is made by hand, and almost always as a dry stone wall without mortar. They usually result from clearing granite stones and rubble from the fields, and they are economic: not requiring mortar, they do not suffer from frost attack, and so little maintenance is needed. At first sight they all appear similar, but in fact there are different construction styles, with single, double and combination walls as the basic classification. They are malleable: walls can easily be moved and re-configured – gates are not strictly necessary, since a few stones can easily be removed and put back again to, for example, let cattle through. Patrick McAfee’s book and website are a very readable commentary.

From ground level, the profusion of little stone walls can appear as a complex pattern, perhaps even fractal-like. But viewed from a high point, perspective reveals the logic of the landscape and the paths – the boreens – lined with walls either side, gently meandering to distant places.

I came back to Dublin last night, and this morning read Martin Banks’ excellent overview of Buckminster in Reg Developer. Since I first wrote about Buckminster, the team has considerably improved the tool, including the documentation kit. Martin’s article makes use of a bricklaying analogy, and I guess I hinted in my own blog entry that if you are to build structures from re-usable bricks, then it might be useful to have a web site somewhere at which various designs could be published, found and compared…

Well, also while I was away in Connemara, www.cloudsmith.com went live. Many of those on the Buckminster team have collaborated to put the site together, and the initial incarnation of the site certainly turned out to have richer functionality than I myself expected in a first iteration. There is a fairly detailed overview on the About Cloudsmith page, but in summary:

  • Cloudsmith keeps meta-data about assemblies of software components.
  • Software components can be sourced from any number of public and private repositories worldwide. Of course, components from private repositories are only available to those duly authorized to use them.
  • Cloudsmith does not store the components themselves: but it knows where they are worldwide and how it can access them in the appropriate repository formats.
  • A software publisher – an individual, project, or company – can register one or more specific software component assemblies with Cloudsmith.
  • A software consumer – an individual, project, or company – can search and browse for available assemblies; and can readily download and install any particular one – “materialize” in Cloudsmith-speak – onto his local machine (or indeed another machine if appropriately authorized).

In effect, Cloudsmith is building a global map of software components (in various forms: source, binary, versioned, and optionally with test suites, documentation and license agreements). Professional software developers - individually or in a community project or working on a commercial offering – can publish interesting new assemblies of components, sourced across one or more repositories.

One of the neatest capabilities of Cloudsmith is the Cloudlink. A Cloudlink is simply a URL: it can be sent in an email, or given in a blog, or whatever. When a Cloudlink is clicked, the software assembly which it denotes is materialized, without further intervention, onto the local machine. This gives a very simple download mechanism: publish a Cloudlink, and anyone clicking on it within a recent-vintage web browser can download your software. In practice, when a Cloudlink is clicked, behind the scenes the Cloudsmith site is contacted; it resolves the differences between the assembly of software components identified by the Cloudlink and the components already available on the local machine, and then fetches (as appropriate, from various repositories worldwide) and downloads the missing components.
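
At its heart, that resolution step is a set difference between what the Cloudlink names and what the machine already has. Here is a minimal sketch of the idea – my own illustration, with invented component names, not Cloudsmith’s actual algorithm, which also resolves versions, dependencies and per-repository access.

```java
import java.util.*;

// Toy materialization: compute which components named by a Cloudlink
// are missing locally, then "fetch" them. My own illustration -- not
// Cloudsmith's actual algorithm, which also resolves versions,
// dependencies and per-repository access.
public class Materialize {
    public static void main(String[] args) {
        Set<String> wanted = Set.of("editor-core", "spell-checker", "pdf-export");
        Set<String> installed = Set.of("editor-core");

        Set<String> missing = new TreeSet<>(wanted);
        missing.removeAll(installed);   // set difference: what still to fetch

        for (String component : missing)
            System.out.println("fetching " + component + " from its repo...");
    }
}
```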

Cloudlinking in turn enables “virtual distributions”. A software publisher can create a virtual distro whose components reside across multiple (e.g. open source) projects and repositories: materializing a virtual distro requires nothing more than a web browser.

If your project is looking for a simple way to make its software available to the worldwide community; if your project is itself using software from multiple sources and multiple projects; if you want to keep your community regularly updated with patches and extensions; if you want to manage installation and distribution processes – then Cloudsmith should be worth taking a look at.

Software components, and configurations and assemblies of them, are very malleable. It is relatively easy to define interesting new configurations, as well as new components. Looking at the worldwide activity, and the multitude of repositories and projects, it is easy to become overwhelmed. It is possible to detect patterns, and different styles of construction, but sometimes it can be very confusing to see the overall themes, to understand how other people are using configurations, and what changes have occurred.

I’m reminded of Connemara’s dry stone walls. They are numerous, wonderful, simple, easily changed, easily re-built, easily maintained, and as a result have lasted for decades. But in the landscape and up close, they are confusing to absorb and see the overall picture. The perspective of height gives clarity.

Cloudsmith is giving clarity to the construction of assemblies of software components.