January in Sydney tends to be a quiet month for business, only really coming to life after Australia Day on the 26th. From the 4th onwards there is a gradual return to workplaces that are abuzz with new projects, new sales opportunities, new marketing campaigns, all scheduled to start towards month end or early February. As a result, energy levels of those at work tend to be higher and minds – less stressed than usual, perhaps – a little more open to reflection. So, with that audience in mind, here is a blog about the bedrock of BPM systems (BPMS), and indeed BPM in general, that is, the reason why any business should bother to use it.
The benefits of using a BPMS can be summarised as:
- Higher productivity
- Faster process times (eg end-to-end)
- An order-of-magnitude improvement in process transparency/visibility
- An entry point to a virtuous circle of process understanding, improvement and execution
- Better control over the process (eg through the use of embedded business rules)
- Improved job satisfaction for operational staff using the system
Of these the most significant – in terms of both impact and universality – is undoubtedly process visibility. Businesses that lack a BPMS – or alternatively core or ERP software that fully encompasses their business processes – are like vehicles driven in a thick fog. Data about the past is patchy and unreliable, data about today incomplete and highly reliant upon personal observation, and data about the future seriously compromised. (‘Data about the future’ can be both hard – today’s backlog, how much work is in pending and when it falls due – and relatively soft, for example resource forecasting based upon historic productivity data). Whilst this is most obviously true of operations teams, it extends into customer interaction too. For example, without accurate end-to-end process times there is little chance of understanding the customer experience except by analysing complaints.
Whilst senior business managers can be passionate in demanding visibility, Boards can demand more. IT investments do not always produce a return and, to continue the fog analogy, a Board may reason: we may not be able to see much out of the windows, but we’ve got this far OK, there is some rear and side vision, and the future is always going to be largely uncertain.
Which is where the higher productivity benefit comes in handy. Higher productivity in the operations teams can pay for the initial project and the ongoing IT costs, whilst the benefits arising from enhanced visibility (tightly focussed process improvement in operations lowering costs, better informed product development and customer service increasing revenue) are spread across the business.
Some productivity benefits come with the territory – work distribution, management of pended/diarised work and enquiry handling are all areas where the BPMS will automate previously manual tasks, delivering benefits pretty much by default. Other areas like load balancing (between teams or individuals), exception handling (eg duplicate requests) and re-work management can take a little more work to achieve results. Hard benefits from these and other BPMS features are commonly augmented through add-ons such as automated outgoing document management and integration with core systems, which – whilst initially adding to project costs – can be relied on to provide further productivity improvements.
Faster process cycle times, particularly end-to-end times, routinely arise from a BPMS implementation. Naturally, additional focus and effort can drive further improvement. Where tiered service levels are required, the BPMS can use prioritisation to ensure that outcomes are optimised – something that can be especially hard to achieve where processing is manual.
Process control may be a more or less compelling reason to use a BPMS, and may mean high-end technical control and/or regulatory control. Running business processes through a BPMS will prove to the regulator – and to other stakeholders – that the process has been followed in a specific case (through the audit trail), and is followed in general (through sharing the process definition). Automating business rules will ensure 100% compliance with those rules that are automated and typically will make it harder for a careless or rogue employee to break others. It will also provide control where a – highly automated – process must happen so fast that people are no longer able to participate except on exception, that is, straight through processing.
And improved job satisfaction? Well, whilst rarely at the heart of the business case, the evident satisfaction of employees in receiving better tools for their work certainly makes change management that much easier. And doesn’t everyone want happier employees?
OK – that’s it. Your revision for the day is complete. And as a reward, here is my favourite viral video of the holiday – the Brooklyn Space Program. Any connection with BPM? Just inspiration from their ambition and ingenuity, and perhaps aspiration that we could achieve as much with so little. Maybe we need to recruit some younger project team members?
Another IBM Agility seminar at the Shangri-La Hotel, and some BPM announcements. And in contrast with the sunny spring skies warming Sydney’s harbour (for those of you in the northern hemisphere), the best bit here was the cloud.
But first… Websphere Lombardi Edition is to have drag-and-drop integration with both FileNet P8 Content Server and Content Manager 8. The extent of the functionality involved wasn’t clear to me – presumably IBM will start with search/retrieval and later move on to others like metadata update and new document insertion? Anyway, further integration will be with Websphere Service Registry and Repository – useful for orchestration purposes – and with iLog, where it will be possible to browse and select an existing Ruleset on a predefined iLog JRules Execution Server.
In the meantime Websphere iLog itself is to be coupled with Websphere Business Events to become Websphere Decision Server, extending IBM’s business events capability, whilst the iLog BRMS SupportPac is to provide Websphere Business Monitoring and predictive analytics integration.
All very worthy, but much less interesting than the next piece of news, which was the launch of Blueworks Live. This combines three elements – the Blueworks BPM collaboration community (blogs, wikis); the highly successful (Lombardi) Blueprint process discovery and definition environment; and a new workflow execution engine. All running in the Cloud and, apparently, available through your browser for a test drive from November 20th. (Yes, that’s this Saturday – perhaps one of the software world’s most specific launch dates ever…!).
Now, Cloud-based BPM is hardly new. Cordys was one of the first to offer it globally, and there are niche players too, such as Australian company OpenSoft, which uses open source products to provide integrated Cloud-based BPM to the burgeoning Australian energy and resources sectors. However, Cloud-based BPM from IBM is something else entirely. IBM’s existing mindshare in the global BPM market and its credibility as a corporate Cloud (and FM) provider mean that the interest in this product will be enormous, and as a result it could well be a game-changer for all BPM stakeholders.
The PowerPoint-based demo that followed included a marketing manager setting up a new process for her latest marketing initiative. Yes, that’s one process for one case/process instance. And if the PowerPoint is to be believed, it only took her a few minutes.
How can this fail? The CIO’s happy because it’s SaaS; the Board because it’s IBM; the Ops Manager is comfortable because it’s running in an IBM Datacentre; the process improvement people have Blueprint to play with; the IT teams can focus on integrated, production BPM system work; and best of all the Business can replace its endless email trails with easy-to-access, auditable business processes.
So what next? Well, here’s a prediction – Blueworks Live will do for business processes what Microsoft Sharepoint did for enterprise content – it will get everywhere. That means a step change in awareness regarding BPM (how many business – or even IT – people knew of ECM before Sharepoint?) and huge opportunities for BPM professionals to sort out all of those ‘home grown’ processes. Bring it on!
I’m running behind with my blogging. It’s now several weeks since the Pegasystems Business Process Symposium took place here in Sydney, however whilst not quite ‘hot off the press’ the event is easily worth reporting on, even now, for its excellence at three levels – case studies, product and philosophy.
Pega’s philosophy – or at least my understanding of it – puts top priority on ease of use for both developers and end users. This means plenty of functionality that is easy to put together into processes, and thereafter just as easy to maintain. This is a big ask – business processes tend to be complex, and the technology set required to support them is fairly broad – and can only be achieved through a pretty stubborn focus by the vendor.
This philosophy came across quite graphically in a Q&A session towards the end of the day. Alan Trefler, CEO and founder of the company, was asked why Pega wasn’t providing more extended support for custom Java user interface development. Now 9 out of 10 company representatives put in this position would have (a) spoken at length about the support that was already in place and (b) at least implied that further and even more exciting developments were on their way. Not Mr Trefler. He told the questioner that custom Java code was far too slow to develop to be useful in BPM deployments – instead, it was the responsibility of BPM vendors to provide a UI builder, fully integrated with the core product, that was fit for purpose. The Pega roadmap? It would continue to improve the built-in Pega UI builder… and if any customer or prospect felt that there was functionality lacking in it, he would be delighted to make the investment necessary to develop the product further.
Now that’s focus. I have been responsible as a manager – and, going back a few years, as a developer – for BPM implementations with both flavours of UI, native and custom built (ie Java/.Net). From a productivity point of view the native (BPM) UI wins hands-down, both because it is simpler to use and because a single developer can define both the process flow and the accompanying screens together. There is no need for an interface, two sets of data definitions and, worst of all, two different developers each with a slightly different skillset and understanding of the requirements. The native UI has only one catch – without real commitment from the vendor, the UI builder tends to have significant functional gaps. Close those gaps and you have a winner.
On a different topic, he was asked about the rationale for the Chordiant takeover. The answer was interesting in that it emphasised Chordiant’s core differentiator, its predictive and adaptive capabilities, which support more intelligent management of (eg) customer retention, cross-selling and fraud processes. Applying this technology to end-to-end processes, rather than simply the CRM front end, has the potential for significant value-add.
It is perhaps this combination of a practical, experience-based development focus with innovation where it can really make a business impact – rather than simply following the latest technology trend – that explains why Pega tends to have rather interesting case studies. On this occasion it was Mike Efron, eBusiness Manager from Wesfarmers Insurance who spoke about using Pega to provide a rules- and process-based consumer portal through which Kmart Tyre & Auto Service is selling white-labelled personal lines insurance products. The key here was ‘building for change’ – Pega’s slogan, which this project realised through defining specifically those aspects of the solution that were not required to change – and then leaving it to the system’s designers and the system itself to ensure that everything else could change. He told the audience that once Kmart Tyres was safely live, it took the team just two weeks to change the system sufficiently to support a second ‘white label’ customer.
A second case study that was mentioned at the event was British Airports Authority. This is the sort of innovative case study that refreshes one’s interest in BPM. How many BPM solutions have as their primary input channel not email, not scanned mail … but radar? Rather than my re-writing it, check out Gartner’s take on it here.
The final topic is of course the latest product news. This is well-documented on the Pega site, and the highlights for me were:
– A new Case Management version of the product with a slick user interface and a process architecture that includes effectively unlimited nesting of cases. So a motor claim can include separate sub-processes for vehicle repair and personal injury; the personal injury claims can include separate processes for the several individuals involved, each with multiple different types of injury, and so on. All neatly tied together into the Case Manager’s desktop.
– Other Case Management features include ad hoc tasks, delegation, support for multiple parties and related cases, correspondence management and reporting.
– New Process Designer features that are used for Process Discovery. These are similar to those introduced by a number of other vendors in recent years with the important addition of requirements traceability. I understand this is made available as a cloud service to the Pega Developer Network.
– Project management tools (eg for task, risk and issue management, and including wiki and Twitter-like functionality) that use Pega core technology and can be configured to fit the desired SDLC approach (waterfall, agile etc). This looks well-developed enough to use, though the overlap with third party systems is obvious. It’ll be interesting to see how this area develops.
Overall this was an excellent event, showcasing a product that is increasingly differentiating itself from its peers, and was much enhanced by the presence of the CEO himself in Sydney.
I attended an IBM ‘Business Agility’ workshop at Sydney’s Shangri-La Hotel yesterday – the first IBM event to feature BPM that I’ve managed to get to since the Lombardi purchase. It was a Websphere event, which meant that it included Lombardi and excluded FileNet, so I was a little concerned that the BPM section might be dominated by talk of process orchestration and middleware layers, rather than end-to-end processes.
I needn’t have worried. The Websphere team has embraced IBM Lombardi (as we must now know Teamworks) with great enthusiasm, and started a day of real (yes, live) demonstrations with several that showed off Lombardi to good effect. Point and click SLA setup; process stats (such as wait or execution times) displayed through a mouse-over in the unified process model-define-simulate view; colourful monitoring views populated with whatever defined field you required – just click that checkbox on the field definition dialogue; and so on.
There were also Websphere Dynamic Process Edition (Process Server, as was) demos. The emphasis there was on architecture, integration and transactional integrity. The latter featured a high-wire demo, with 100 updates to two databases on separate servers, interrupted by the speaker who pulled out the connecting cable to the second (Oracle, as it happened) with a flourish. 56 updates had been processed successfully and, to the relief of all, the other 44 were in a ‘failed’ queue, from which they were dispatched – to a successful completion – by a single click on the ‘resume’ button once the cable was re-connected. We were told that the product was unique amongst BPMSs in fully supporting two-phase commits, with resume, restart and ‘compensate’ options for system administrators.
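As an aside, the park-and-resume pattern demonstrated is worth sketching in its own right. The toy Python below is entirely my own illustration – it has nothing to do with WDPE internals – and shows the essential behaviour: updates that fail are parked in a queue rather than lost, then replayed in order when the operator hits resume.

```python
from collections import deque

class Dispatcher:
    """Park failed updates instead of losing them; replay them on resume."""

    def __init__(self, target):
        self.target = target        # callable that may raise ConnectionError
        self.failed = deque()

    def dispatch(self, update):
        try:
            self.target(update)
        except ConnectionError:
            self.failed.append(update)   # park it – don't lose it

    def resume(self):
        # Replay parked updates in their original order.
        while self.failed:
            self.target(self.failed.popleft())

# Simulate the demo: the 'cable' is pulled after 56 of 100 updates.
applied = []
connected = True

def db_write(update):
    if not connected:
        raise ConnectionError
    applied.append(update)

d = Dispatcher(db_write)
for i in range(100):
    if i == 56:
        connected = False            # cable pulled
    d.dispatch(i)

connected = True                     # cable re-connected
d.resume()                           # the single 'resume' click
print(len(applied), len(d.failed))   # 100 0
```

The key design point is that failure handling is separated from business logic: the dispatcher owns the queue, so the caller never sees a lost update.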
All of which provided – to this viewer – a pretty clear, if unspoken, message. For the human side of BPM (the typical financial services back office, perhaps), Lombardi is IBM’s answer, packed with business-friendly features. Alternatively, if the business depends on multiple integration points that require sophisticated sequencing, error handling and recovery options – bullet-proof delivery, in other words – WDPE does the job (telco provisioning comes to mind). And for the business that needs both, well, integration between the two is currently available through web services, with work under way to convert Lombardi to IBM’s Service Component Architecture, the basis of the Websphere product range.
One other demonstrated feature of WDPE that I liked, by the way, is the easy way in which routing rules can be changed without re-deploying (or even opening for editing) the process itself. This seems like an obvious feature, but by no means all BPMSs share it. Isolating the change eliminates the need for system and regression testing and even (depending upon the process design and one’s perception of risk) UAT. Now there’s something that offers Business Agility.
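The idea is easy to illustrate outside any particular product. In the sketch below (plain Python with invented names – not the WDPE mechanism itself), the routing decision is a lookup against externally held rule data, so behaviour changes when the rules document changes, with no change to the process logic and hence no re-deployment.

```python
import json

# Routing rules held outside the process definition – here a JSON document,
# in production perhaps a database table or a config service. Changing this
# document changes routing without touching the process itself.
RULES_JSON = """
{
    "claim": [
        {"field": "amount", "over": 10000, "queue": "senior_assessor"},
        {"field": "amount", "over": 0,     "queue": "assessor"}
    ],
    "enquiry": [
        {"field": "amount", "over": -1, "queue": "contact_centre"}
    ]
}
"""

def route(work_item, rules=None):
    """Return the next queue for a work item, using the first matching rule."""
    rules = rules or json.loads(RULES_JSON)
    for rule in rules[work_item["type"]]:
        if work_item.get(rule["field"], 0) > rule["over"]:
            return rule["queue"]
    raise ValueError("no routing rule matched")

print(route({"type": "claim", "amount": 25000}))  # senior_assessor
print(route({"type": "claim", "amount": 500}))    # assessor
```

Raising the claim threshold from 10000 to 50000 is then a data edit, not a process change – which is exactly why it can skip regression testing, subject to the caveats above.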
Conversations that include the words ‘Pegasystems’ and ‘buy’ in the same sentence tend to cast Pega as the vendor, so this move has the immediate benefit of surprise.
I’m not going to attempt too much learned reflection on the purchase, since I know very little about Chordiant. However, I do know that CRM / BPM mergers aren’t always easy – Staffware bought a small US CRM vendor in the late 90s (whose original name escapes me now) and my own sense was that in the end this was more distraction than synergy. It’s hard to make a world-beating BPM system and, no doubt, the same goes for CRM. Trying to maintain both of those market positions whilst simultaneously promoting an entirely new ‘CRM+BPM’ market position as well requires a near-superhuman organisation of the engineering teams, not to mention sales & marketing.
What leaves me with some interest and excitement is that Pega’s vision has for some years been focussed on all-round excellence in BPM, with an emphasis on BPM’s original mission – providing systems that render the exceedingly complex (UI + process + rules + integration + data) sufficiently easy for business process deployment to be affordable and – crucially – for subsequent process change (= agility) to be realistic.
Given this track record, it could just be that Pega will use Chordiant’s technology to push ‘Build for Change’ in quite new and original directions. BPM needs a lift at the moment – perhaps from a ‘should buy’ to a ‘must buy’ – and the vendor that delivers this transition will be richly rewarded. My money to date has been on the big players – IBM, Oracle, SAP – simply because of the size of the challenge in terms of both re-engineering and re-launching into the market. However, perhaps Pegasystems will leverage its intense focus to take the lead, showing the rest of the market the way.
Let’s hope that the fact of the purchase is the least of the surprises Pegasystems has in store.
For more facts and figures, check out this article. Good stuff.
A full house today at Sydney’s Sofitel Wentworth for the first 2010 Tibco User Group meeting. Networking – with cold beer, very civilised – was followed by the corporate positioning pitch and then on to a tour of the 2010 roadmap. Whilst many individual points were of interest, the overall message was clear – we’re all business process specialists now. Whether we use BPM tooling as such (Tibco’s iPE) or Complex Event Processing (‘BusinessEvents’) as the ‘pointy end’, the ‘stack’ is all about business process outcomes.
Highlights on the traditional BPM front include new Organisational Modelling extensions to iPE; new Forms options including Google Web Toolkit and Windows Presentation Foundation (the key here will be the depth of integration with the rest of the stack); and new process optimisation functionality arising from the convergence of iPE and Spotfire (the latter being a rich/easy to use BI tool that threatens to be as much loved by the business, and as hard to control by the IT department, as Sharepoint). And tibbr, Tibco’s corporate answer to Twitter, was also featured – very compelling, it takes the concept of ‘following the customer’ to a whole new level.
The session was topped off with a case study from Vodafone Hutchison Australia, a 2009 merger that claims 27% of the Aussie mobile market, and growing fast. Presentations like this, whilst always well-meaning, can be a bit repetitive – we’ve all heard similar ones before. This one stood out in two respects. Firstly it related the replacement of VHA’s core provisioning and customer service system, handling 100k+ tx/day, with the Tibco stack in just 6.5 months – this old hand was impressed.
And secondly, VHA used BusinessEvents rather than iPE, despite the latter having a significant track record in the telco space (very high volume provisioning, MNP and others). This remained unremarked in the presentation, and I was unable to reach the front of the queue to speak with their Architect afterwards. I did find myself speaking with one of VHA’s competitors though, who confided that if he had the choice he would really like to use both products, with BE orchestrating iPE. A topic I shall try to delve into further in future blogs….
Isn’t it time we re-thought the definition of BPM? It seems to be getting increasingly jelly-like – wobbling and spreading to encompass every possible interest group. Check this out – the ‘official’ definition (at least until the editing debate settles down) from Wikipedia: “Business process management (BPM) is a management approach focused on aligning all aspects of an organization with the wants and needs of clients.” Presumably contrasting sharply with previous generations of business management methodologies, which focussed on aligning all aspects of an organization with the wants and needs of small furry animals. I love Wikipedia, but if this is the wisdom of crowds, then civilization is surely doomed.
My recollection is that the term BPM came into usage in the late 90s as a way for new entrants (Savvion, Lombardi, Metastorm and Ultimus come to mind) to the existing workflow automation market to differentiate themselves from the incumbents. A key aspect of these newcomers was a serious attempt to make the process definition environment more friendly and useful to business process analysts/modellers, for example with simple BPMN flowcharts and built-in simulation.
However, the newcomers could not emphasise this aspect alone, partly because the incumbents already used graphical process definition (albeit, they would argue, a less business friendly version), and partly because their prospective clients were also concerned with other features, such as ease of integration and reporting. So “BPM” became associated not only with built-in business modelling/simulation but also better integration and reporting (think Business Activity Monitoring) – something not contested by workflow automation incumbents, since some already had excellent integration and they could swiftly match any reporting improvements. Very quickly, everyone sold “BPM” products.
And everyone bought them. The tremendous success of BPM technology, particularly in banking/financial services, telcos & the public sector, over the last 12-13 years (since the explosion of new “BPM” products in the late 90s) has had a further, diluting effect on the terminology. BPM projects are now routinely enterprise-scale, and are therefore attracting a spending level significantly higher than most process improvement/modelling initiatives. This in turn means that a much wider constituency of professionals wants to be involved in BPM projects – and often rightly so, given that enterprise roll-outs do require a wider range of skills, particularly in relation to business (process) analysis. Unfortunately some of these folk are taking positions in relation to BPM terminology that owe much more to their history in process improvement than to the technology that created BPM.
Is this just technology bias? Well, consider your favourite BPM project, and imagine the impact if all of the BPM technology was suddenly removed. What new analytical or process improvement method would remain to distinguish the activities of business improvement folk today from those you might have seen 15 years ago? Six Sigma, Lean and more general process improvement techniques are tremendously important – but have they changed so much in recent years that they constitute a new business management methodology called BPM?
So is BPM defined by technology alone? Think again of your favourite BPM project, and this time remove the entire concept of process improvement, Six Sigma and Lean. What are you left with? I suspect something that looks a lot like workflow automation, at least in its primitive form – a process defined and automated … then the project finishes and the process improvement team (suddenly reappearing) pulls out its Visio charts and starts to negotiate future change with the IT department.
If BPM is to have any substantive and paradigm-changing meaning, it must include both technology and process improvement in a way that reflects their roles and illuminates their synergies. A suggestion:
BPM is the superior state of process management attained when business process analysis and improvement activities are supported by technology workbenches that are themselves deeply integrated with the systems in which the processes are to be executed.
This definition addresses the relationship between BPM, process modelling/simulation, workflow and ERP (and other types of ‘Core’ systems). The following statements become true:
• “BPM” products that do not include deeply integrated workbenches for process modelling, simulation and analysis are not BPM – they are workflow. Nothing wrong with that – many, perhaps most, of the world’s “BPM” implementations to date probably fall into this category and they have provided significant return on investment to their customers.
• Where a workflow product does include deeply integrated workbenches for process modelling, simulation and analysis it is indeed BPM or, perhaps better, ‘BPM-enabled workflow automation’.
• ERP (and other) systems that include deeply integrated workbenches for process modelling, simulation and analysis may also be classified as BPM – or perhaps ‘BPM-enabled ERP’.
• Process improvement professionals can state that they are practicing BPM (or process improvement in a BPM environment) if and only if they are using BPM-enabled technology. They might thus be practicing BPM in relation to processes executed in a workflow automation or an ERP system.
Such a definition would set a standard for all stakeholders and provide a target for both business and vendors to aim at, with a vision of process improvement and execution progressing in harmony. That vision is both radical – whilst a number of products would pass the BPM test above, many solutions built on those products remove the levers that would put process improvement professionals in the driving seat – and ‘back to basics’, in that it is very much what the pioneers of workflow automation envisaged in using ‘graphical process definition’ in the early 90s.
It remains a compelling vision, and one that could drive competitive advantage for those that adopt it in the decade to come. The first step is to collectively recognise the vision and the terminology under-pinning it for what they are, and discard all wobbly-jelly BPM definitions along with sub-prime loans and easy credit, as relics of the decade we’ve just left.
Lombardi’s particular strengths in relation to other IBM BPM products are its fully built-in process simulation function – including use of historic ‘live’ data – and the genuine business agility that arises from the 360° functionality (integration, rules, process, user interface) that is managed from a single development environment. Whilst it shares one or both of these with other ‘pure play’ BPM vendors, Lombardi has won enough gongs from industry analysts and others in recent years to regard itself as a leader of the ‘pure play’ pack – no doubt a reason for IBM’s buy decision.
IBM says that it will be targeting Lombardi at ‘departmental’ and ‘human-centric’ solutions; it references speed of build (‘fast start for immediate value’), and, intriguingly, is looking in 2010 to ‘leverage Lombardi to expand BPM offerings in emerging markets’.
All of which suggests that IBM’s selling price for Lombardi products is likely to become the epicentre of activity for IBM’s BPM strategists over the coming vacation. Departmental and emerging markets – both great targets for Lombardi’s technology – are not known for big spending. Competition in these markets from still-independent (and local) BPM vendors will be intense, putting downward pressure on prices. Meanwhile technical differentiation with Websphere BPM and FileNet (in particular – if ‘content-centric’ BPM isn’t meant for humans, who is it for?) may well be less than IBM’s initial positioning suggests. As a user of the Websphere stack, or of FileNet content management, considering moving on to BPM and wanting to stay with IBM, why wouldn’t you buy Lombardi, particularly if the price is attractive?
IBM’s internal product management challenges aside, this purchase is likely to produce more winners than losers. Those who have invested in Lombardi already (an elite club here in Australia to date) and those considering doing so will be pleased to have their decision underwritten by IBM, with the prospect of improvements in support and service options to come; similarly, this is good news for Websphere and FileNet customers who have yet to invest heavily in BPM – it expands their choices too. And with what should be a significant boost to their market, some of the biggest winners could be Lombardi service providers. Watch out for skills shortages.
No doubt FileNet customers who have already invested in BPM will be looking for reassurance that their product roadmap will not be adversely influenced by the Lombardi purchase. In particular users of the Business Process Framework, FileNet’s equivalent of Lombardi’s forms builder, will be asking questions of IBM regarding future BPM forms developments; questions that may also interest some Lombardi customers. Does Lombardi’s ‘human-centricity’ imply that one day all IBM BPM users will use their forms? Or will an entirely new forms paradigm be available to all FileNet, Websphere and Lombardi users? My bet is on the latter.
And whilst IBM technologists ponder the future of enterprise BPM forms, where business rules (iLog?) drive super-flexible UI components (why do BPM vendors so readily refer to ‘orchestrating’ SOA components but never ‘orchestrating’ the user interface?), perhaps their thoughts will turn to the product that will really change the market. One that IBM is prepared to position as both enterprise level and – like Lombardi – truly agile.
btw if you want to read the IBM announcement, you can find it here. Onward links are top right – the ‘FAQ’ document expands considerably on the press release.
I was recently discussing BPM agility with a senior IT manager when the topic of configuration files came up. “Many of my colleagues don’t like them,” he said. “We have hundreds of them, and they don’t like changing them because they don’t know what the consequences will be.”
It’s easy to smile at this, but practically all guardians of customised systems are in a similar position. I’m writing about it because config files (or config data, to generalise) can, used correctly, be a powerful way to drive change in BPM systems: data can be significantly easier to change and deploy than code.
What sort of data might one want to externalise in order to improve BPM agility? A few examples:
- Field labels
- Text to be included in reports/alerts
- Route/path ids and sub-process ids (to permit config-driven re-routing)
- Data relating to rules (‘if value of loan > $1000 then’ will obviously be more flexible if the value – $1000 – is a variable) if the rules are not already externalised through a rules engine (which would be better still).
- Data that drives which cases/process instances should adopt changed config values. Should changed values be picked up by closed or archived cases? Should they only apply to new cases, or cases with particular characteristics?
As the last point suggests, there is an important system design aspect to this, regarding when config data should be (re-)read and how much granularity is required to control change impact.
And of course, the data to be held will vary widely, depending both on the business processes and the BPMS used.
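Pulled together, externalised config for a simple loans process might look something like the sketch below. This is illustrative only – the structure and field names are invented – but it covers each category above: labels, alert text, routing ids, rule values and change-scope data.

```python
import json

# Illustrative externalised config for a loans process. Everything here can
# change without a code deployment: labels, report/alert text, routing
# targets, rule thresholds, and the scope of cases a change applies to.
CONFIG = json.loads("""
{
    "labels":  {"applicant_name": "Applicant name",
                "loan_value": "Loan amount (AUD)"},
    "alerts":  {"sla_breach": "Case {case_id} has breached its SLA"},
    "routing": {"high_value_subprocess": "SP-credit-check-2"},
    "rules":   {"high_value_threshold": 1000},
    "scope":   {"apply_to": "new_and_open_cases", "exclude_closed": true}
}
""")

def is_high_value(loan_value, config=CONFIG):
    # 'if value of loan > $1000' – with the $1000 held as config data,
    # not hard-coded into the process.
    return loan_value > config["rules"]["high_value_threshold"]

print(is_high_value(1500))  # True
print(CONFIG["alerts"]["sla_breach"].format(case_id="C-0042"))
```

Raising the threshold or re-wording the alert is now a data change; whether closed or archived cases pick it up is governed by the `scope` section, which is exactly the design question raised above.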
For this data driven approach to deliver significant improvements to BPM agility the mechanisms used must be thoroughly tested – part of ‘Agility Testing’. Unless one can be confident that changes to the config data will result in predictable outcomes, abstracting it will provide few benefits, since changes will once more need to be system tested, delaying deployment.
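An ‘agility test’ in this sense is simply an automated check that a given config change produces exactly the predicted change in behaviour, and nothing else. A minimal sketch (plain Python, invented names):

```python
# A minimal 'agility test': change one config value and assert that the
# dependent behaviour changes exactly as predicted.

def requires_approval(amount, config):
    return amount > config["approval_threshold"]

def test_threshold_change_is_predictable():
    config = {"approval_threshold": 1000}
    assert requires_approval(1500, config)          # above threshold: approval
    assert not requires_approval(800, config)       # below: straight through

    config["approval_threshold"] = 2000             # the 'config change'
    assert not requires_approval(1500, config)      # now below the new threshold
    assert not requires_approval(800, config)       # unrelated case unaffected

test_threshold_change_is_predictable()
print("agility test passed")
```

A suite of checks like this, run automatically whenever config data changes, is what justifies skipping full system testing for data-only deployments.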
One significant source of comfort for the hard-pressed project manager is that Agility Testing can be carried out after the initial go-live. Not perfect, obviously, but a promise that ‘the system will become easier to change after the second deployment’ will put you ahead of many BPM solutions.
So are config files, as such, the best way of delivering the config data into production? Well, no. Config files are hard to add CRUD functions to (and so are often edited in Notepad – where it is easy to make a mistake) and must be deployed (albeit a very simple deployment).
Using a relational database (RDBMS) helps a lot in terms of basic validation, and if one is already in use on your project it will be easy to piggyback on existing or planned resilience and recovery features. However, without a user interface (UI), the standard method of changing the data is via a stored procedure (for audit and repeatability reasons) and this, once more, will require testing and deployment.
A custom UI would be best. It needn’t be pretty – it will only be used by Admin users or Prod Support so some shortcuts may be possible in terms of corporate standards. However, it will need to be able to provide Read/Update (possibly Create/Delete) functions for quite an assortment of data, so some extendable way to segment the data, such as tabbing, will help. Most importantly it must include an audit log and, ideally, a function that will import an existing audit log, making the changes included.
The reason for the audit log import is procedural. Even though no technical issues should arise from changes made via the UI, there is still potential for the Business to have erred in their change request. So an initial deployment into a User Acceptance Test (UAT) environment may well be required. The UI may be used to directly change the UAT config data – indeed this should be so easy that it could be done with a business representative alongside. Once the Business is satisfied with the result, the audit log can be deployed into Production and run against the Production UI (as an import) with no further testing required, since the resulting Production changes will necessarily be identical to those in UAT.
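The promote-by-replay mechanism is simple enough to sketch. In the toy version below (my own illustration – a real implementation would sit over the RDBMS), every change made via the UI is audited, and Production is updated by importing the UAT audit log, so the Production changes are identical to UAT by construction.

```python
import json

class ConfigStore:
    """Config store with an audit log that can be exported and replayed."""

    def __init__(self, data):
        self.data = dict(data)
        self.audit_log = []

    def update(self, key, value, user):
        # Every change is audited: who changed what, from what, to what.
        self.audit_log.append(
            {"key": key, "old": self.data.get(key), "new": value, "user": user}
        )
        self.data[key] = value

    def export_log(self):
        return json.dumps(self.audit_log)

    def import_log(self, log_json, user="deployment"):
        # Replay each audited change, re-recording it in this store's own log.
        for entry in json.loads(log_json):
            self.update(entry["key"], entry["new"], user)

# UAT: a business-approved change, made via the UI with the business alongside
uat = ConfigStore({"high_value_threshold": 1000})
uat.update("high_value_threshold", 2000, user="ops_manager")

# Production: import the exported UAT log – no re-keying, no re-testing
prod = ConfigStore({"high_value_threshold": 1000})
prod.import_log(uat.export_log())
print(prod.data["high_value_threshold"])  # 2000
```

The design choice to replay the log, rather than copy the resulting data, is what preserves the audit trail in both environments.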
The BPM project steering committee will need to assess the value of this sort of approach, since it clearly involves a cost. The degree to which config data can be used to drive agility in a BPM solution will vary, as will the value that the Business places on it.
And of course all of this works best within an A-BPM (Agile BPM – here today, Gartner tomorrow!) development framework.