BPM Futures


IBM Cloud steals Lombardi thunder

Another IBM Agility seminar at the Shangri-La Hotel, and some BPM announcements. In contrast with the sunny spring skies warming Sydney’s harbour (for those of you in the northern hemisphere :cool: ), the best bit here was the cloud.

But first … WebSphere Lombardi Edition is to have drag-and-drop integration with both FileNet P8 Content Server and Content Manager 8. The extent of the functionality involved wasn’t clear to me – presumably IBM will start with search/retrieval and later move on to other functions such as metadata update and new document insertion? Anyway, further integration will be with WebSphere Service Registry and Repository – useful for orchestration purposes – and with ILOG, where it will be possible to browse and select an existing ruleset on a predefined ILOG JRules Execution Server.

In the meantime WebSphere ILOG itself is to be coupled with WebSphere Business Events to become WebSphere Decision Server, extending IBM’s business events capability, whilst the ILOG BRMS SupportPac is to provide integration with WebSphere Business Monitor and predictive analytics.

All very worthy, but much less interesting than the next piece of news, which was the launch of Blueworks Live. This combines three elements – the Blueworks BPM collaboration community (blogs, wikis); the highly successful (Lombardi) Blueprint process discovery and definition environment; and a new workflow execution engine. All running in the Cloud and, apparently, available through your browser for a test drive from November 20th. (Yes, that’s this Saturday – perhaps one of the software world’s most specific launch dates ever…!).

Now, Cloud-based BPM is hardly new. Cordys was one of the first to offer it globally, and there are niche players too, such as Australian company OpenSoft, which uses open source products to provide integrated Cloud-based BPM to the burgeoning Australian energy and resources sectors. However, Cloud-based BPM from IBM is something else entirely. IBM’s existing mindshare in the global BPM market and its credibility as a corporate Cloud (and FM) provider mean that the interest in this product will be enormous, and as a result it could well be a game-changer for all BPM stakeholders.

The PowerPoint-based demo that followed included a marketing manager setting up a new process for her latest marketing initiative. Yes, that’s one process for one case/process instance. And if the PowerPoint is to be believed, it only took her a few minutes.

How can this fail? The CIO’s happy because it’s SaaS; the Board because it’s IBM; the Ops Manager is comfortable because it’s running in an IBM data centre; the process improvement people have Blueprint to play with; the IT teams can focus on integrated, production BPM system work; and, best of all, the Business can replace its endless email trails with easy-to-access, auditable business processes.

So what next? Well, here’s a prediction – Blueworks Live will do for business processes what Microsoft SharePoint did for enterprise content – it will get everywhere. That means a step change in awareness of BPM (how many business – or even IT – people knew of ECM before SharePoint?) and huge opportunities for BPM professionals to sort out all of those ‘home grown’ processes. Bring it on!


Pegasystems Sydney Symposium

I’m running behind with my blogging. It’s now several weeks since the Pegasystems Business Process Symposium took place here in Sydney. However, whilst not quite ‘hot off the press’, the event is easily worth reporting on, even now, for its excellence at three levels – case studies, product and philosophy.

Pega’s philosophy – or at least my understanding of it – puts top priority on ease of use for both developers and end users. This means plenty of functionality that is easy to put together into processes, and thereafter just as easy to maintain. This is a big ask – business processes tend to be complex, and the technology set required to support them is fairly broad – and can only be achieved through a pretty stubborn focus by the vendor.

This philosophy came across quite graphically in a Q&A session towards the end of the day. Alan Trefler, CEO and founder of the company, was asked why Pega wasn’t providing more extended support for custom Java user interface development. Now 9 out of 10 company representatives put in this position would have (a) spoken at length about the support that was already in place and (b) at least implied that further and even more exciting developments were on their way. Not Mr Trefler. He told the questioner that custom Java code was far too slow to develop to be useful in BPM deployments – instead, it was the responsibility of BPM vendors to provide a UI builder, fully integrated with the core product, that was fit for purpose. The Pega roadmap? It would continue to improve the built-in Pega UI builder …. and if any customer or prospect felt that there was functionality lacking in it, he would be delighted to make the investment necessary to develop the product further.

Now that’s focus. I have been responsible as a manager – and, going back a few years, as a developer – for BPM implementations with both flavours of UI, native and custom-built (ie Java/.Net). From a productivity point of view the native (BPM) UI wins hands down, both because it is simpler to use and because a single developer can define both the process flow and the accompanying screens together. There is no need for an interface, two sets of data definitions and, worst of all, two different developers, each with a slightly different skillset and understanding of the requirements. The native UI has only one catch – without real commitment from the vendor, the UI builder tends to have significant functional gaps. Close those gaps and you have a winner.

On a different topic, he was asked about the rationale for the Chordiant takeover. The answer was interesting in that it emphasised Chordiant’s core differentiator, its predictive and adaptive capabilities, which support more intelligent management of (eg) customer retention, cross-selling and fraud processes. Applying this technology to end-to-end processes, rather than simply the CRM front end, has the potential for significant value-add.

It is perhaps this combination of a practical, experience-based development focus with innovation where it can really make a business impact – rather than simply following the latest technology trend – that explains why Pega tends to have rather interesting case studies. On this occasion it was Mike Efron, eBusiness Manager at Wesfarmers Insurance, who spoke about using Pega to provide a rules- and process-based consumer portal through which Kmart Tyre & Auto Service is selling white-labelled personal lines insurance products. The key here was ‘building for change’ – Pega’s slogan, which this project realised by defining specifically those aspects of the solution that were not required to change, and then leaving it to the system’s designers and the system itself to ensure that everything else could change. He told the audience that once Kmart Tyre & Auto was safely live, it took the team just two weeks to change the system sufficiently to support a second white-label customer.

A second case study that was mentioned at the event was British Airports Authority. This is the sort of innovative case study that refreshes one’s interest in BPM. How many BPM solutions have as their primary input channel not email, not scanned mail … but radar? Rather than my re-writing it, check out Gartner’s take on it here.

The final topic is of course the latest product news. This is well-documented on the Pega site, and the highlights for me were:
– A new Case Management version of the product with a slick user interface and a process architecture that allows effectively unlimited nesting of cases. So a motor claim can include separate sub-processes for vehicle repair and personal injury; the personal injury claim can include separate processes for the several individuals involved, each with multiple types of injury, and so on – all neatly tied together in the Case Manager’s desktop (see the sketch after this list).
– Other Case Management features include ad hoc tasks, delegation, support for multiple parties and related cases, correspondence management and reporting.
– New Process Designer features that are used for Process Discovery. These are similar to those introduced by a number of other vendors in recent years with the important addition of requirements traceability. I understand this is made available as a cloud service to the Pega Developer Network.
– Project management tools (eg for task, risk and issue management, and including wiki and twitter-like functionality) that use Pega core technology and can be configured to fit the desired SDLC approach (waterfall, agile etc). This looks well-developed enough to use, though the overlap with third party systems is obvious. It’ll be interesting to see how this area develops.
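
On the nested-cases point above, here is a minimal sketch – emphatically not Pega’s implementation, just an assumption-laden illustration of my own – of what ‘effectively unlimited nesting’ implies as a structure: any case can own child cases of its own, recursively.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch only: a case that can own child cases recursively,
// which is all that 'unlimited nesting' requires of the underlying structure.
class Case {
    final String type;                 // e.g. "Motor claim", "Personal injury"
    final List<Case> subCases = new ArrayList<>();

    Case(String type) { this.type = type; }

    Case addSubCase(String childType) {
        Case child = new Case(childType);
        subCases.add(child);
        return child;
    }
}

class NestingExample {
    public static void main(String[] args) {
        Case claim = new Case("Motor claim");
        claim.addSubCase("Vehicle repair");
        Case injury = claim.addSubCase("Personal injury");
        injury.addSubCase("Injury - driver");
        injury.addSubCase("Injury - passenger");
        // A case manager's desktop would roll all of this up under the claim
    }
}
```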

Overall this was an excellent event, showcasing a product that is increasingly differentiating itself from its peers, and was much enhanced by the presence of the CEO himself in Sydney.


Let data make that process dance

I was recently discussing BPM agility with a senior IT manager when the topic of configuration files came up. “Many of my colleagues don’t like them,” he said. “We have hundreds of them, and they don’t like changing them because they don’t know what the consequences will be.”

It’s easy to smile at this, but practically all guardians of customised systems are in a similar position. I’m writing about it because config files (or config data, to generalise) can, used correctly, be a powerful way to drive change in BPM systems: data can be significantly easier to change and deploy than code.

What sort of data might one want to externalise in order to improve BPM agility? A few examples:

  • Field labels
  • Text to be included in reports/alerts
  • Route/path ids and sub-process ids (to permit config-driven re-routing)
  • Data relating to rules (‘if value of loan > $1000 then’ will obviously be more flexible if the value – $1000 – is a variable), if the rules are not already externalised through a rules engine (which would be better still). See the sketch after this list.
  • Data that drives which cases/process instances should adopt changed config values. Should changed values be picked up by closed or archived cases? Should they only apply to new cases, or cases with particular characteristics?
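
To make that rule example concrete, here is a minimal sketch of what externalising a threshold (and a field label) might look like. It is deliberately generic – the class and property names (ProcessConfig, loan.referral.threshold) are my own inventions, not tied to any particular BPMS.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.math.BigDecimal;
import java.util.Properties;

// Hypothetical example: process values held as config data rather than
// hard-coded in the process definition or its supporting code.
public class ProcessConfig {

    private final Properties props = new Properties();

    public ProcessConfig(String path) throws IOException {
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        }
    }

    // 'if value of loan > threshold then refer' - the threshold is data, not code
    public BigDecimal loanReferralThreshold() {
        return new BigDecimal(props.getProperty("loan.referral.threshold", "1000"));
    }

    // Field labels can be renamed without touching the process definition
    public String fieldLabel(String fieldId) {
        return props.getProperty("label." + fieldId, fieldId);
    }
}

class ReferralRuleExample {
    public static void main(String[] args) throws IOException {
        ProcessConfig config = new ProcessConfig("process-config.properties");
        BigDecimal loanValue = new BigDecimal("1500.00");
        // The rule itself stays in the process; only the value it tests is external
        if (loanValue.compareTo(config.loanReferralThreshold()) > 0) {
            System.out.println("Route case to manual referral");
        }
    }
}
```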

As the last point suggests, there is an important system design aspect to this, regarding when config data should be (re-)read and how much granularity is required to control change impact.

And of course, the data to be held will vary widely, depending both on the business processes and the BPMS used.

For this data-driven approach to deliver significant improvements to BPM agility, the mechanisms used must be thoroughly tested – part of ‘Agility Testing’. Unless one can be confident that changes to the config data will result in predictable outcomes, abstracting it will provide few benefits, since changes will once more need to be system tested, delaying deployment.

One significant source of comfort for the hard-pressed project manager is that Agility Testing can be carried out after the initial go-live. Not perfect, obviously, but a promise that ‘the system will become easier to change after the second deployment’ will put you ahead of many BPM solutions.

So are config files, as such, the best way of delivering the config data into production? Well, no. Config files are hard to add CRUD functions to (and so are often edited in Notepad – it’s easy to make a mistake) and they must still be deployed (albeit a very simple deployment).

Using a relational database (RDBMS) helps a lot in terms of basic validation, and if one is already in use on your project it will be easy to piggyback on existing or planned resilience and recovery features. However, without a user interface (UI), the standard method of changing data in an RDBMS is via a stored procedure (for audit and repeatability reasons), and this, once more, will require testing and deployment.

A custom UI would be best. It needn’t be pretty – it will only be used by Admin users or Prod Support, so some shortcuts may be possible in terms of corporate standards. However, it will need to provide Read/Update (and possibly Create/Delete) functions for quite an assortment of data, so some extendable way to segment that data, such as tabbing, will help. Most importantly, it must include an audit log and, ideally, a function that will import an existing audit log and apply the changes it contains.

The reason for the audit log import is procedural. Even though no technical issues should arise from changes made via the UI, there is still potential for the Business to have erred in their change request. So an initial deployment into a User Acceptance Test (UAT) environment may well be required. The UI may be used to directly change the UAT config data – indeed this should be so easy that it could be done with a business representative alongside. Once the Business is satisfied with the result, the audit log can be deployed into Production and run against the Production UI (as an import) with no further testing required, since the resulting Production changes will necessarily be identical to those in UAT.
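
To make the record-and-replay idea concrete, here is a minimal sketch of how such an import might work. The class names and the drift check are assumptions of mine, not a reference to any particular product.

```java
import java.time.Instant;
import java.util.List;

// Hypothetical sketch: every config change made through the admin UI is
// recorded as an audit entry, and the same entries can later be imported
// and re-applied, unchanged, in another environment.
record ConfigChange(Instant when, String changedBy, String key,
                    String oldValue, String newValue) {}

interface ConfigStore {
    String read(String key);
    void write(String key, String value);
}

class AuditLogReplayer {

    private final ConfigStore store;

    AuditLogReplayer(ConfigStore store) {
        this.store = store;
    }

    // Re-apply a UAT audit log against Production. Because the entries are
    // applied verbatim, the resulting Production config matches what the
    // Business accepted in UAT.
    void replay(List<ConfigChange> auditLog) {
        for (ConfigChange change : auditLog) {
            String current = store.read(change.key());
            if (current != null && !current.equals(change.oldValue())) {
                // The environments have drifted - stop rather than apply blindly
                throw new IllegalStateException(
                        "Unexpected current value for " + change.key());
            }
            store.write(change.key(), change.newValue());
        }
    }
}
```

The drift check is a design choice: halting on an unexpected current value protects the ‘no further testing required’ claim, because it guarantees that Production really does start from the same position as UAT.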

The BPM project steering committee will need to assess the value of this sort of approach, since it clearly involves a cost. The degree to which config data can be used to drive agility in a BPM solution will vary, as will the value that the Business places on it.

And of course all of this works best within an A-BPM (Agile BPM – here today, Gartner tomorrow!) development framework.

Happy trails…


“Agile” – time for a change

Back in the early 90s we used to demonstrate workflow by building a 3-step ‘Leave Application’ process from scratch, run it from the user perspective, change it, then run it again. The message, reinforced by our patter and marketing collateral, was clear – workflow was Agile. And you can read the same from every BPM vendor today – BPM will result in (or facilitate, support – pick your own weasel word) Agility.

The reality is rather different. Most operations managers have to prioritise the changes they need to their BPM system and these typically get delivered according to a release schedule that is measured in weeks rather than hours.

Why? Well, there’s the complexity: real business processes are usually much more complex than their users first thought; they are defined in tools that are smart but not quite perfect (think workarounds); they include multiple integration points (so we may need to change the Java code too); and the user interface is shared with other systems and environments. So the typical end result of a single process implementation is something that is quite hard to ‘get your head around’ and requires special skills to change. For a multi-process deployment, this gets even harder. And this complexity of interacting components results in genuine risk – of developer error, of user error (in terms of clearly thinking through what is required), and of deployment error.

These risks are typically mitigated through a series of tests. Systems Testing is specifically designed to test that the changed components work together as expected by the developer. Regression testing will test that the rest of the system has not been impacted by the change. Acceptance testing ensures that the user gets what they (thought they) asked for, and gives them a chance to change their minds. And post-implementation verification testing validates that the smorgasbord of changed components that has been deployed hasn’t destabilized the live system. All of this testing and deployment activity takes time and is more efficiently carried out on batches of process changes, rather than one change at a time. The actual deployment may well need to happen outside of working hours, too, as a further risk mitigation strategy. It therefore tends to happen every few weeks, not daily.

Is this inevitable? I don’t think so. After all, there are some changes that are always carried out swiftly – on the same day or, at worst, overnight. Adding a new user, complete with the required permissions profile, has quite complex effects – not only on that user’s system access, but also on reporting, enquiries and supervisor access – yet it is typically carried out within hours. Why shouldn’t the same apply to process changes?

The answer, today, is that your current BPM solution – typically a BPM product that has been configured, customized and extended to meet your requirements – will not have been designed to support true Agility.

Does any of this matter? Well, think back to the much-derided paper-based process that the BPM system replaced. The process was largely determined by the contents of a tick sheet stapled to the front of the manila folder that contained the case documents. The users knew the process rules, which were based upon what was ticked and what wasn’t.  So changing the process involved changing the tick sheet (thanks, MS Word) and giving new instructions to the team. It could be done in hours. Sure, the end result was much more error prone, less efficient and lacking in MIS. But are we currently trading enormous improvements in all of these, when we deploy BPM, for a loss in agility?

I believe this to be the single biggest challenge to BPM, with truly agile BPM providing potentially one of the most radical changes that technology can contribute to business processing. As an industry, can we rise to the challenge? What did The Man say….? Yes We Can?

To be continued. Happy trails….


BPM – the future starts here

Having read a few other ‘first blogs’ there seems to be a tradition of keeping the first one light. Well, I’m not going to do that – I want to get stuck straight into a current preoccupation: what is the future of BPM? Now I admit that I’m biased towards a positive view, having worked in BPM (née workflow) since 1990, and the title of this blog, ‘BPM Futures’, also indicates a certain confidence.

However, having been focused pretty much exclusively on business unit, account and project management for the last 5 years (mostly relating to BPM, btw), I feel in need of a domain refresh. So this is it – I hope you’ll also find it of some interest and use.

So why do organizations buy BPM, rather than other solutions available in the global software market? My answers are listed below, along with some observations that will provide recurring topics for future blogs.

1. The Spot Solution “We want to manage our process better, whilst keeping most or all of the standard data in systems other than the BPM system”.

Comment – essentially a tactical view, this generally happens because the client has calculated that it’s cheaper/easier to use a BPM system than to extend their existing system(s) for the purposes of process management.

Examples – streamlining a home loan approval process that straddles an enterprise customer information system and a ‘stovepipe’ home loans system; managing an accounts payable process in which several thousand employees are occasional users and only half a dozen users – in the a/p section of the accounts department – need to use the a/p module of the accounting system itself.

Challenges and alternatives – if the cost of replacing the existing system were acceptable, most newer ‘core’ systems – such as home loans and a/p – include workflow (if not BPM) functionality appropriate to the specific application, eliminating the need for integration.

Opportunities and the future – merger and acquisition (M&A) activity, if nothing else, will maintain this market for BPM. And of course the gradual adoption of industry-standard interfaces and standards by core system vendors will make the integration challenges easier over time. That said, there is plenty of room for improvement in the BPM solutions offered.

2. The Strategist “We see benefit in using the same process management tool for a wide range of processes”.

Comment – Why? Because it makes skills management easier and allows standardization on common process management features such as process definition, version and release management;  data management/reporting; full cycle process improvement; process simulation. At a more strategic level, BPM is seen as protecting process assets from change at the transaction system level, in particular providing flexibility following merger or acquisition.

Examples – there are many large organizations that have invested heavily in BPM and deployed it very widely indeed, with enterprise process management as an explicit goal. Particular examples exist in retail banking, life insurance, wealth management, P&C insurance and telcos.

Challenges – naturally, big enterprise solutions raise the biggest problems: agility/ease of change of processes; adherence to standards; the difficulty of full lifecycle process improvement in a highly diverse technical environment; the role for rules management; and managing the explosion of UI requirements that arises from enterprise BPM deployment.

Alternatives – the current wave of core system replacements in the banking industry and the increasing adoption of eTOM-based enterprise telco solutions seem to pose a medium-term challenge to BPM (and, perhaps to a lesser extent, SOA) as a separate industry, though the concepts should live on within the replacement products.

Opportunities and the future – not only core systems but also enterprise BPM systems are potential subjects for upgrade, particularly as the limits of ‘first generation’ enterprise BPM systems become apparent to user organizations. There are a number of areas that would benefit from improvement and, at least until core system upgrades are complete (at least 10 years away), there is demand willing to pay for them.

3. The Document Manager “We started to look at document management – scanning our incoming mail and working from this and our incoming faxes through an image viewer, rather than paper – and realized that workflow and, better, BPM, was a significant value add”.

Comment – this is a common sense and frequently used reason for BPM adoption, particularly in a departmental and/or small/medium enterprise context. The current strategic challenge for document management product vendors – increasing commoditization and therefore lower margins – provides opportunities for solution/service providers and customers alike.

Opportunities and the future – whilst this customer perspective starts a little differently (with document management) it soon returns to the familiar territory of specific process management (what are the scanned documents being used for?) and interactions with other systems. So the challenges are similar to ‘The Spot Solution’ and others.

4. The Administrator “We’re looking for a cheap and cheerful way to improve our simpler, administrative processes that our business requires but none of our core systems provide”

Comment – classic examples used are leave processing; timesheets; expense claims; many aspects of employee on-boarding and maintenance, such as security card applications; and so on. Also processes needed to support short term projects often fall into this general category.

Of course even ‘simple’ processes like these would ideally include integration (do we need to see receipts before approving expense claims? shouldn’t timesheet and leave processes interact with the payroll system?). Most BPM systems will provide the technology to provide this integration, though the services required might make the overall business case tough.

Opportunities and the future – this has always been a popular area for workflow, particularly if integration can be ignored (ie worked around by a human operator). Today the relatively simple functionality required combines naturally with BPM delivered through the web as a service, with power provided by cloud computing. No doubt it will soon be as easy and cheap to build an online process as it is today to build a web site (just start with a template…), providing exciting opportunities for a huge number of users and a (very?) few cloud-based suppliers, whilst soon removing this as a market for traditional workflow/BPM solution providers.

5. The Case Manager “We need a Case Management system”

Comment – this is an interesting one. BPM products have frequently been used as part of, or even as the core product for, case management systems. The reason is the obvious one: BPM products include much of the functionality required for case management, plus the promise of some interesting ‘extras’ (simulation, full cycle process improvement), and in addition many have a significantly larger user base than specifically ‘case management’ products.

Challenges – there are many points of divergence between case management requirements and those of other processes, and many of these can be seen as gaps in BPM product functionality. Two examples:

(1) Case management tends to mix status (‘state’) with workflow process rules. A case may be worked on by a case worker, who may carry out a range of actions, some of which include their own workflows, such as a referral to a colleague. Once a defined state is reached the case must be sent for approval (another workflow), and once approved it is returned to the case worker, who now has a different range of permitted activities because the state has changed. Standard ways of representing process flows, including BPMN (Business Process Modeling Notation – think swim lanes), are very good for workflows but clumsy and long-winded for defining and illustrating state changes – ‘state transitions’, in the jargon.

(2) Case management puts a lot of emphasis on data relationships. In a legal case, for example, a law firm may need to record many clients, witnesses and third parties in respect of related cases. BPM systems tend to support only simple data relationships easily, at both the data storage and UI levels, and require customization to support more.

These ‘gaps’ provide opportunities for service providers to craft custom solutions today, and product vendors to extend their products tomorrow.
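
To illustrate point (1), here is a minimal sketch of the state/workflow split: the case state determines which actions a case worker may start, quite separately from any workflow an individual action may launch. The states and actions are invented for the example, not drawn from any product.

```java
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: permitted case-worker actions depend on the case
// state, quite separately from any workflow an individual action may launch.
enum CaseState { OPEN, AWAITING_APPROVAL, APPROVED, CLOSED }

enum CaseAction { EDIT_DETAILS, REFER_TO_COLLEAGUE, SUBMIT_FOR_APPROVAL,
                  APPROVE, REJECT, ISSUE_CORRESPONDENCE, CLOSE }

class CaseStateModel {

    // State transition table - the part that swim-lane notations express only clumsily
    private static final Map<CaseState, Set<CaseAction>> PERMITTED = Map.of(
            CaseState.OPEN, EnumSet.of(CaseAction.EDIT_DETAILS,
                                       CaseAction.REFER_TO_COLLEAGUE,
                                       CaseAction.SUBMIT_FOR_APPROVAL),
            CaseState.AWAITING_APPROVAL, EnumSet.of(CaseAction.APPROVE,
                                                    CaseAction.REJECT),
            CaseState.APPROVED, EnumSet.of(CaseAction.ISSUE_CORRESPONDENCE,
                                           CaseAction.CLOSE),
            CaseState.CLOSED, EnumSet.noneOf(CaseAction.class));

    static boolean isPermitted(CaseState state, CaseAction action) {
        return PERMITTED.getOrDefault(state, Set.of()).contains(action);
    }
}
```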

For the best summary of the special features of case management I have seen recently, check out the following, published here by the ever-readable Business Process Trends:

http://www.bptrends.com/publicationfiles/07-09-WP-CaseMgt-CombiningKnowledgeProcess-White.doc-final.pdf

Yes, it’s produced by a vendor – Singularity – but it’s a great contribution to the literature on the subject, nonetheless.

Opportunities and the future – demand for, and therefore interest in, case management appears high at the moment. This probably reflects its concentration in the Public Sector, where every new government initiative requires administrators/administration and, usually, case management. And there is no equivalent of eTOM on the near horizon, as far as I’m aware (any ideas from anyone else on that?).

So the market for case management systems that manage case data, processes and reporting flexibly and reliably appears secure for now. Improvements are, again, needed. The recent signs of movement in the OMG, triggered by a draft RFP from the guys at Cordys, might result in some standards in this area, and/or might stimulate a product vendor to push ahead with some real innovation.  Lots of topics here for future blogs…

If you read this far, well done. Not an easy read, and I can assure you, not an easy write either!

Happy trails……..