Monday, August 22, 2011

Key Learnings - Using EDA to implement the core SOA principle of "loose coupling"

A lot has been said about how SOA and EDA are distinct "architecture styles". Too often, only one or the other paradigm is considered when proposing architectural solutions. However, there is a distinct benefit to using both in unison to solve business problems.


Business events are the core concept driving any EDA implementation. The core theme of SOA, on the other hand, is decoupling business applications and the business functions/business processes embedded in them. SOA implementations rely on a standards-based web services technology stack and canonical business documents (XML).

However, it is my contention that an enterprise that has not invested in web services or ESB technology can still leverage EDA-style business events to implement loosely coupled business services, provided it makes the effort to analyze its business events and create canonical representations of them. This could mean defining business events that encapsulate a business concept along with a business concept state indicator or a business action indicator. Further, these business events can trigger constructs such as event handlers that act as a facade, or layer of indirection, to execute a business function via an application API.
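To make this concrete, here is a minimal sketch of what a canonical business event and a facade-style event handler could look like. The field names (`entity`, `state`, `action`) and the `OrderHandler`/`FakeOrderApi` classes are illustrative assumptions, not a standard or a specific product's API:

```python
from dataclasses import dataclass

# A canonical business event: a business concept plus a state indicator
# and an action indicator, as described above.
@dataclass(frozen=True)
class BusinessEvent:
    entity: str        # business concept, e.g. "Order"
    entity_id: str     # identifier of the concept instance
    state: str         # business concept state indicator, e.g. "APPROVED"
    action: str        # business action indicator, e.g. "SHIP"

class OrderHandler:
    """Event handler acting as a facade: callers see only the canonical
    event; the application API call is hidden behind handle()."""
    def __init__(self, order_api):
        self._api = order_api   # hypothetical application API

    def handle(self, event: BusinessEvent) -> str:
        if event.action == "SHIP":
            return self._api.ship_order(event.entity_id)
        raise ValueError(f"unsupported action: {event.action}")

class FakeOrderApi:
    """Stand-in for the real business application's API."""
    def ship_order(self, order_id: str) -> str:
        return f"shipment created for {order_id}"

handler = OrderHandler(FakeOrderApi())
result = handler.handle(BusinessEvent("Order", "42", "APPROVED", "SHIP"))
```

Because the producer only constructs a `BusinessEvent`, the application API behind the handler can change without the producer ever knowing.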

It must be noted that the terms event producer and event consumer (or publisher/subscriber) are being used loosely here to denote the initiator of the event and the owner of the business behavior that "knows" how to deal with the event. In an SOA realm these would be the service consumer and the service provider respectively.

The key to leveraging this model in SOA is the use of self-describing canonical business events that are subscribed to by independent event listeners. These listeners, and the event handlers they delegate to, insulate event producers and event consumers from the complexity of knowing how to interpret the events. The producers/publishers and consumers/subscribers are thus decoupled from one another via canonical business events as well as messaging technologies.

Either of the two layers, the event publisher or the event subscribers, can be altered as long as the contract defined by the canonical business events is adhered to. Also, messaging-technology configuration consoles allow event publishers and subscribers to be connected via metadata as opposed to being hard-wired in code. Event handlers act as event adapters in that they can translate the event and invoke the required API call to deal with it. The event handlers thus act as a business facade that hides the workings of the business application and allows it to change without affecting the event producers.
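The metadata-driven wiring described above can be sketched as follows. Here a plain dictionary stands in for a messaging console's subscription metadata; the `inventory_adapter` function and the event/payload names are assumptions for illustration:

```python
# Subscribers are bound to event types via metadata (a dict here,
# standing in for a messaging product's configuration), not in code.
subscriptions = {}   # event_type -> list of handler callables

def subscribe(event_type, handler):
    subscriptions.setdefault(event_type, []).append(handler)

def publish(event_type, payload):
    # The publisher knows nothing about who consumes the event.
    results = []
    for handler in subscriptions.get(event_type, []):
        results.append(handler(payload))
    return results

# The handler plays the adapter role: it translates the canonical
# event payload into the API call the application understands.
def inventory_adapter(payload):
    return f"replenish {payload['product']} x{payload['quantity']}"

subscribe("LowInventoryAlert", inventory_adapter)
out = publish("LowInventoryAlert", {"product": "SKU-1", "quantity": 100})
```

Re-pointing an event type at a different handler is then a metadata change (another `subscribe` call), leaving the publisher untouched.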

To recap, if loose coupling is a core SOA principle that promotes business agility, then event handlers invoked via messaging technology and canonical business events can deliver on that goal. The enterprise does not need to invest in a SOAP stack right away. If desired, technology standardization and interoperability can be introduced in a later phase by turning these event handlers into web services. This two-phased approach lets the enterprise defer investment in the technology stack without sacrificing business agility or the ability to offer novel business capabilities.
 
Please feel free to drop me a note.
thanks.
surekha -

Tuesday, August 09, 2011

Launching MDM as Part of Larger Initiatives

Let me walk through a situation that may be common in many organizations. You, as an IT visionary, recognize the need for an MDM initiative but are having difficulty securing funding due to cost concerns or simply organizational inertia. You decide to ride the coattails of another major initiative whose success is somewhat related to the successful implementation of MDM. Is that a good idea? I would suggest yes, but be careful about setting expectations and getting the right level of buy-in, or your successful execution of the MDM initiative may be perceived as not so successful. This can happen because of the personalities involved, or because MDM implementation risks were not included in your schedule or cost estimates. You can be sure that you are going to run into data issues, and the amount of data you will have to deal with will always be more than you think. All these issues may impact the larger initiative and will reflect badly on the MDM effort.

Make sure all stakeholders understand that while MDM is expected to support the larger initiative, it has its own benefits and its goals are much larger in scope. Also, include additional time and cost to mitigate the risks.


Ashok

Saturday, July 02, 2011

Business Event Subscriber Responsibility?

Thought I would write a comment about the expectations of a business event subscriber. These would be rules that the business event publisher can depend on, and such sets of rules form the basis for Event Driven Architecture (EDA) based implementations.


Subscribers of business events and business alert notifications often assume that the business event publisher is responsible for ensuring that duplicate events and repeat alert notifications are suppressed. However, to protect itself the subscriber has to be able to "analyze" the business event to determine whether an erroneous event/alert was sent by the business event provider, the messaging architecture, or the enterprise service gateway. The subscriber (business application, business process) has to "know" when to discard a business event as a duplicate, as well as when to re-apply the same business event, which could have been re-issued as a result of a change in business policy or an increase in a business alert threshold. Knowing when an event transmission is real versus a false notification ensures that the outcomes of EDA implementations are valid.

For instance, if a "low inventory alert" is received by the Purchasing Process, it may react by transmitting a PO to the supplier. However, if the same Purchasing Process gets a second alert a few seconds or a few minutes later, chances are it will choose to ignore it. Not examining the alert more closely may cause it to be discarded erroneously. It is possible that, in order to optimize and tune response processing, the subscribing process looks only at the product and quantity to de-dup alerts received within a certain period, causing the second alert to be discarded. The business impact of ignoring that second alert would be a failure to respond to an increase in the "minimum product level". The bottom line: the subscribing process has to know how to differentiate between fundamental business rule changes and duplicate alerts, and not let a compute-process optimization overrule business policies.

In general, in an event-driven architecture the disconnectedness of the publisher/subscriber pair places a burden on both parties to ensure that mechanisms and rules exist to enable "semantic" translation of the message payload that carries the business event or business alert. Without investing in the definition and analysis of these semantics, the loose coupling of an EDA architecture would deliver a scalable technical architecture model but would not yield the right business outcome.

It must be noted that standards such as WS-Eventing, WS-BaseNotification and WS-Notification help define WSDL and XML Schemas for the mechanics of loosely coupled and interoperable EDA implementations, but the business semantics and rules for constructing and consuming the events are not really well laid out. Not having had exposure to specifications such as ebXML and OAGIS business documents, I assume some B2B interactions may have more mature semantic definitions, but for the most part I think these rules of engagement still need to be fleshed out by the participating parties.

Thoughts and feedback on this topic would be very useful.
surekha -

Sunday, May 29, 2011

Service granularity and service reuse - why information semantics are key?


In the following blog post, which refers to the Amazon CEO's letter to shareholders, one is truly amazed to read how the Amazon team constructs a single product detail page from a combination of 200 to 300 services!

Amazon's architects may have truly found a way to identify the right level of service granularity and to combine those services into composite offerings without distorting the information in the combined service. This so-called semantic dissonance has led many of us architects to create coarse-grained services that preserve the quality of information, but at the cost of reuse.

This is the balancing act: how to define a service with the right grain of information encapsulated in it, such that once it is combined with another service, the composite is still able to deliver meaningful information at the same grain as the original services. This is what we mean by the semantics that govern how to combine two or more granular services into a meaningful composite service that continues to encapsulate valid and accurate information. It may be technically possible to create a composite or a single coarse-grained service from two granular services (as long as the canonical models/payloads have common elements), but the combination may overlook key business constraints, business rules, regulations and algorithms that render the final response invalid and inaccurate.

A very simple example: combining product sales for a region with product marketing dollars to yield a service that delivers marketing efficacy may be a great new service concept. However, if the product sales service does not provide information about the tenure of the product in the region, or the presence of similar competitor products there, then the marketing efficacy service cannot accurately account for the lift in sales. The reason is that the service cannot attribute part of the apparent efficacy of the marketing dollar to a "novelty" factor (where the product is the first to enter the market in this region, or debuts at a very attractive "entry price point"). Both of these artificially make the marketing campaign strategy look far more successful than it might really be; the price and novelty had more to do with the sales lift.
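A tiny numeric sketch of this semantic dissonance, with wholly invented figures and field names (the `novelty_factor` field stands in for the missing context about tenure and competitor presence):

```python
# Hypothetical payloads returned by two granular services.
sales = {"region": "West", "sales_lift": 120_000, "novelty_factor": 0.5}
marketing = {"region": "West", "spend": 40_000}

# Naive composite: joins on the common element (region) and divides.
# Technically valid, semantically misleading.
naive_efficacy = sales["sales_lift"] / marketing["spend"]

# Semantics-aware composite: discounts the lift attributable to
# novelty before crediting it to the marketing spend.
attributable_lift = sales["sales_lift"] * (1 - sales["novelty_factor"])
adjusted_efficacy = attributable_lift / marketing["spend"]
```

With these numbers the naive composite reports an efficacy of 3.0 while the adjusted one reports 1.5: same payloads, same join key, but the business meaning of the result differs by a factor of two.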
 
The use of information by the business to make strategic decisions, the study of information flow across lines of business, and the alignment of business information to strategic business activities are key to business architecture. By extension, it is in the realm of business architecture to govern service interactions and the creation of composition strategies that guard against semantic dissonance.

Please share with me your thoughts.

surekha -
 

How to take a Transaction Based Vendor Relationship to the next level?

In the following blog post I talk about "What is the definition of a strategic partnership with a vendor?" and outline a few thoughts on how large enterprises should engage with their vendor partners and what the expectations of such a relationship should be. A fellow blogger of mine expressed his "disappointment" with vendor partners and their focus on selling new products as opposed to helping maximize the returns from existing IT assets.

I wholeheartedly agree that the pressure from vendors can be exasperating. I was thinking about whether there is a way to turn this around. What if, as an example, your IT product vendor were told that they would have to provide expertise (at no cost) to solve current interoperability issues or resolve performance bottlenecks between their product and one other product, while showcasing an operations admin and management/monitoring (OAM) tool? This would be their elite product engineering or premium consulting service participating on-site, mentoring your team, and not the "support" organization that has to troubleshoot remotely.

Another example: if this is a packaged solution vendor with a long deployment cycle, then the vendor can be asked to offer (again for free) analysis tools, processes, best practices and other accelerator kits that improve speed to market for their product, with the chance to demonstrate components of their next-generation product. Again, the vendor has to part with IP (intellectual property) and address your pain point in exchange for the chance to demo a product that would or could be a fit in "your" enterprise.


The result would be the ability to extend or enhance the life of existing investments while evaluating the vendor's product, processes and expertise in a real-life scenario. Most importantly, however, you are putting their resolve to the test as to whether they are able or willing to be your strategic partner and not just a transaction-based product vendor.

Of course, I would be remiss in not stating that your VMO, PMO and legal departments would have to help ensure this was a fair and scientific discovery process, and also that the licensing models were conducive to both parties.

As always your comments are welcome. 

Surekha -