CEP Application Design - Complex Event Processing

Hello,
I am quite new to the concept of CEP. I am going through the Getting Started guide and the examples that are available as part of the application.
My requirement is to work on an application which is going to receive a huge amount of data, which may be stored in the database or may just reside in .csv files;
this data is going to be the event source from the CEP perspective.
We then need to apply some pattern matching to this data and, if a rule matches, invoke a BPEL process or some web services.
Can I get some help with the architecture design to gain maximum performance?
I would also appreciate it if someone could share some samples of using the Database Adapter and of invoking web services.
Many Thanks in Advance 

Hi,
What you are doing sounds entirely reasonable and we have other customers doing similar things. I suggest you dive in and ask questions as you go along. Most of the CEP engineers monitor these forums, so help should always be at hand.
Cheers
andy 

Hello Andy,
My basic question is: what is the suggested pattern for handling the large data volume? I would say my requirement is more like massaging the source data by applying some rules and then creating another slice of data on which either a report can be generated or an action can be taken.
I was thinking we could create a processor node which reads the database and then filters the data, which can then be put in some event sink. For now I am not too worried about invoking BPEL, since that should be possible in many ways.
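Just to make the idea concrete, here is a rough sketch of the kind of pipeline I have in mind. This is plain Java rather than the actual CEP adapter/processor API, and the event class, table name, and rule are all made up:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.function.Predicate;

// Hypothetical event bean: one row from the source table becomes one event.
class TradeEvent {
    final String symbol;
    final double amount;
    TradeEvent(String symbol, double amount) { this.symbol = symbol; this.amount = amount; }
}

// Hypothetical sink: whatever consumes the filtered events (a JMS queue, a call to BPEL, ...).
interface EventSink { void publish(TradeEvent e); }

public class DbEventSource {
    // Read rows from the source table, apply the rule, and push only the matches to the sink.
    public static void run(String jdbcUrl, Predicate<TradeEvent> rule, EventSink sink) throws Exception {
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT symbol, amount FROM trades")) { // table is made up
            while (rs.next()) {
                TradeEvent e = new TradeEvent(rs.getString("symbol"), rs.getDouble("amount"));
                if (rule.test(e)) {
                    sink.publish(e); // only the qualifying events reach the event sink
                }
            }
        }
    }
}

Calling something like DbEventSource.run(url, e -> e.amount > 100000, e -> System.out.println(e.symbol)) would be the rough equivalent of the processor filtering events into a sink; in the real application the rule would live in the CEP processor and the sink would be whatever invokes BPEL or the web service.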
Please suggest.
Thanks, 

It would help to know what kind of filtering you are expecting to do. You might find that putting the data in Coherence helps with this use case substantially.


FDM Automatic Load

Experts, I have a planning client. We are using FDM.
Generally, what we are trying to do is get nightly loads of data from EBS into Planning using FDM. Upon researching this option we were led to the batch loader. The batch loader seems to be the best option if we were using data files and scripts for FDM, but in our scenario we are going directly to the data source and not using data files.
My questions:
1) Is there a way to have FDM automate this process without the need for the batch loader?
2) Is going directly to the source not the best solution, and should we revert to using data files?
The information that you have received is somewhat incomplete. The question you are asking is somewhat design related and may not be best answered via a forum - keep this in mind as you try to learn more about the product. Design questions are best handled in a formal design session to really drive out all of the issues.
To answer your question, it depends. The automation of the FDM workflow process is best driven by the batch loader; trying to create custom scripts to replicate this functionality is frankly inefficient. The question of whether to go directly against the data source is largely dependent on client requirements. You can automate this process, however.
Thanks Tony,
Per the client requirements we had to go directly to the source, and I agree that a script is inefficient. I was asking the design question just to add more detail to the post. I thought it was an important point to make (direct to the source or using import files).
However, automating this process in FDM without using a script or the batch loader is not very well documented in the manual. Where would you recommend I start reading or learning about this process?
The batch loader is your best option. Unfortunately, learning it is largely a matter of "wrench time" and your network. Play around with it, and as you have questions, feel free to post back.
Good luck. 
So if the batch loader is my best option, does that mean we will then have to produce files from EBS instead of going direct? I am confused here, as it seems to me that in this architecture you either use data files or go direct to the source. From my very junior perspective it seems that going to data files instead of directly accessing the source is a step back, and it will add another gear to the process.
Thanks for your feedback you are definitely getting me pointed in the right direction. 
Hello,
Flat files are required to trigger the process, but you can use a 'blank file' to trigger the process for a location that has an Integration Script or Source Adapter attached.
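Purely for illustration, the nightly job just has to drop an empty file into the batch folder before the batch is run. The folder and file name below are made up; follow the batch naming convention from the FDM admin guide for your location:

import java.nio.file.Files;
import java.nio.file.Path;

public class BlankTrigger {
    public static void main(String[] args) throws Exception {
        // Folder and file name are illustrative only; use the batch naming convention for your location.
        Path trigger = Path.of("C:\\FDMData\\Inbox\\Batches\\OpenBatch\\mylocation.txt");
        if (!Files.exists(trigger)) {
            Files.createFile(trigger); // a zero-byte file is enough to kick off the location's Integration Script
        }
    }
}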
Thank you, 
Hi tony,
I have some problems regarding FDM; maybe you can give me some help.
1. In FDM/Tools/Script Editor, we edit the script to specify directly which database and tables should be connected to, for example:
'Connect to the SQL Server database (our data source)
Dim strConn
strConn = "Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;"
strConn = strConn & "Initial Catalog=TestDB;Data Source=bivision;uid=sa;pwd=sa;"
'Alternative: connect to an Oracle data source instead
'strConn = "Provider=OraOLEDB.Oracle.1;PassWord=cqboa;Persist Security Info=True;User ID=cqboa;Data Source=nsoa;Extended Properties=PLSQLRSet=1"
In the script above we have to key in the UID and PWD. My question is whether we can convert the password ("pwd") from plaintext to an encrypted value. If we can, how do we do it?
2. Could you please give me some information regarding the performance of FDM, for example how FDM performs in a certain server environment with a certain number of records or size of data, or with different numbers of concurrent users?
3. Does FDM support installation in a 64-bit environment?
I would appreciate your kind help.
My email: jeffery.zhang#yahoo.com
Best Regards,
Jeffery Zhang
Edited by: Jeffery Zhang on 2010-11-23 at 11:27 PM
Hello,
To directly answer your questions:
1. No, you cannot encrypt that information. The information is stored as clear text.
2. Unfortunately there are no performance metrics to state. They are very difficult to pin down and vary with many, many different factors: HDD performance, RAM performance, CPU speed, network speed, and whatever else may be going on inside your corporate network to slow down transmissions.
3. Yes.
Thank you, 
Hi W,
Thanks for your help.
However, in my opinion, if FDM supports an API then we could use an encryption module to encrypt the clear text. What do you think about using an API and an encryption module to solve this problem? I don't know whether FDM supports an API and how it works.
I understand it's difficult to evaluate FDM performance under different deployment environments. Could you please send me some implementation cases or usage materials?
Thanks in advance.
Email: jeffery.zhang#yahoo.com
Regards,
Jeffery 
I believe there are encryption/decryption procedures available on the net. Using those, you could first encrypt the password and then use a function to decrypt it at runtime and pass it as a parameter in the connection string. Contact me at anand.harith#gmail.com if you need more information. While it is theoretically possible to reverse engineer the process to get the clear password, it would take some effort.
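Just to sketch the general idea (this is illustrative Java, not something you can drop into the FDM VBScript as-is, and the key handling below is purely hypothetical; you would implement an equivalent routine that the integration script can call):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PasswordCodec {
    // Hypothetical key; in practice keep it outside the script, e.g. in a protected file.
    private static final byte[] KEY = "sixteen byte key".getBytes(StandardCharsets.UTF_8);

    static String encrypt(String plain) throws Exception {
        Cipher c = Cipher.getInstance("AES"); // minimal sketch, not a hardened configuration
        c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(KEY, "AES"));
        return Base64.getEncoder().encodeToString(c.doFinal(plain.getBytes(StandardCharsets.UTF_8)));
    }

    static String decrypt(String encoded) throws Exception {
        Cipher c = Cipher.getInstance("AES");
        c.init(Cipher.DECRYPT_MODE, new SecretKeySpec(KEY, "AES"));
        return new String(c.doFinal(Base64.getDecoder().decode(encoded)), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // Store only the encrypted value in the script; decrypt at runtime when building the connection string.
        String stored = encrypt("sa");
        String strConn = "Provider=SQLOLEDB.1;Initial Catalog=TestDB;Data Source=bivision;"
                + "uid=sa;pwd=" + decrypt(stored) + ";";
        System.out.println(strConn);
    }
}

The password then never sits in the script as plain text, although, as said, someone who can read both the script and the key can still reverse it.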
Regards,
-Anand Harith 
The automation of FDM is easily accomplished through FDM's scripting capabilities. The scripting engine of FDM is very powerful, and I personally have developed a number of solutions for clients that push data to two different targets (as one process) while fully automating the workflow and emailing the detailed status of the workflow, including PDF reports and copies (if applicable) of the intersection validation report.
The point of my post is to share the knowledge that FDM has very advanced capabilities within its own product suite. Hopefully the FDM community will continue to learn about how to more fully utilize the product and additional "cool" solutions will continue to emerge. 
I have an easy question about FDM automation:
for a daily batch, is this automation possible through the FDM API? If the answer is no, can you explain to me the steps to follow?
Thanks for the answer.
PS: I don't know the FDM tool :) and the version is 11.1.2 
Can you please take the time to review this post:
Forum Posting Tips & Etiquette

BPEL Capabilities

I am not very sure whether I am posting this question in the correct forum,
but I would like to know the following. Can anyone please help with these queries:
1) Is communication over IBM MQ and over DB links possible using Oracle BPEL (or its adapters)?
- If yes, are the adapters ready-made or do they need any manual coding?
2) Can Oracle BPEL handle moderately complex computations (e.g. creation of an XML message based on an XSD, plus basic mathematical computations like addition and division, and string manipulation)?
3) How does parameter mapping take place in BPEL? How efficient is it at manipulating these parameters?
1) Communication over MQ is possible using the MQ adapter. It is also possible to use the ESB MQ adapter. This adapter is basically ready to go; you just need to configure the oc4j-ra.xml file. Have a look at the following thread on how to configure it: Help setting up ESB and MQ
I'm unsure what you mean by DB link, as this is more of a database feature. You can, however, use the DB adapter to call procedures, functions, queries, etc. You can also perform updates and inserts.
2) The foundation of BPEL is based around XSDs. You need to develop these yourself, but JDeveloper has a GUI that helps with this development. You can definitely perform mathematical computations as well.
3) Not sure what you mean by parameter mapping, but BPEL has a number of activities that allow developers to map from one variable or schema to another using assigns or transformations.
I suggest you look at the following link to become familiar with the product.
http://download.oracle.com/docs/cd/B31017_01/core.1013/b28937/toc.htm
Once you complete this tutorial you will have a good understanding of SOA Suite's capabilities.
cheers
James 
Hey, thanks a lot for your reply.
The link seems to be of great help and I will go through it.
- By DB link I meant communication with legacy systems over a DB link. I guess you have answered it correctly.
- Though I am still not sure whether we could perform complex operations, since BPEL seems to be a structural language.
- Another question: how would you rate it on the performance front?
I know it is not easy to comment on this since many other factors are involved,
but in general we can rate the performance of an application.
I saw some posts that show performance tuning of this product, but can anyone please comment on its performance?
Can we specify how many threads or instances are created for a particular workflow at a time?
I don't think there will be a need for DB links, as BPEL can be used to connect to any database within your enterprise.
BPEL is used to orchestrate services. These services can be adapters, web services, etc. Therefore most of the complex operations will generally happen within these services. BPEL does have the capability of manipulating variables with complex operations.
Performance does depend on what you are trying to achieve. BPEL processes everything in memory, so if you are loading large files or processing large amounts of data you will need the machine/architecture to handle this.
There are some configuration changes that can help increase performance. The projects I have been involved in have not been that big, processing 2-3 messages a second; BPEL handles this with no issues. Really, to get a decent answer on this you should consult an experienced SOA Suite expert who can provide you with an architecture that will suit your needs.
cheers
James 
Hi James ,
Thanks for your quick and to-the-point replies.
All,
Performance is a major issue and constraint in the case where I will be using BPEL in my application.
I have an application which should be capable of processing approximately 900 messages/min and around 150 processes/min. (I understand that the answer cannot be specific since it is hardware dependent; however, at least some benchmarks would be helpful.)
All/James,
If BPEL keeps everything in memory, then what happens if my process is waiting for an external event for, say, around one day? Will all the data related to my process stay in memory for all that time?
I really need an answer on this since it is a critical consideration.
I'm not aware of any benchmarks, best to contact Oracle for this information. Maybe there is someone out there that has the same sort of load.
If the process is an async process it dehydrates to the dehydration store while it waits. If it is sync then it will most probably time out, unless you set some long timeout parameters (not advisable).
cheers
James 
Hi,
Oracle does have benchmarks available, based on real-life cases and specified in generic terms like: suppose you have a BPEL process that interacts with 5 PartnerLink services, exchanging messages of 20 KB, and some other constraints ... you will need a 1-CPU Intel box with at least X MB of memory in order to process xx,000 processes per hour.
As James rightfully points out, there are a number of things to consider when designing a solution that needs to adhere to certain (performance) requirements; determining what data should be persisted (or dehydrated) is one aspect. And dehydrating process instances that are waiting for a response frees up memory for executing process instances and is therefore an important 'performance enabler'.
Also, there are lots of capabilities (knobs, settings and dials) for tuning the performance of the BPEL engine. And in the process the engine helps by spitting out lots of performance statistics.
Hth,
Sjoerd 
Thanks for all the replies.
I have another question.
BPEL provides sensor functionality for tracking milestones.
Is it possible for the end customer to view the status of his transaction (milestones) without adding any additional interface?
Hi,
A Business Activity Monitoring (or BAM) tool should be able to consume sensor-data that is fired from a BPEL process. At least Oracle BAM is able to do that.
Also note here that the BPEL Console is not an end-user tool (you do not want business users to accidentally hit the 'Purge all instances' button ;-)).
I feel a word of warning is justified here: judging from the questions you are asking it seems to me you are new to this "SOA stuff". Carefully plan and manage the introduction of all this new technology, the organization should be able to deal with the complexity in a proper manner!
Hth,
Sjoerd

When to use ESB ?

Hi !
I have an SR open with Support regarding the transfer of large volumes of data (3M rows).
I'm getting a java.lang.OutOfMemoryError.
Now Support is saying this:
"DB to DB data movement with large volumes is a perfect use case for ODI, particularly if there is any data filtering / cleansing / enrichment desired (now or as a known future requirement).
Perhaps you should investigate ODI (Oracle Data Integrator)."
Could someone from Oracle confirm that ESB is not the tool to use, and please give me a pointer on what to do instead?
Are there any papers on which tool to use in different integration scenarios?
The major difference between BPEL and ESB is speed. ESB is faster than BPEL, but you could also tweak BPEL to get higher throughput. ESB does not have the overhead that BPEL carries, but it is not as rich in functionality, such as error handling and creating a business flow, as BPEL.
Marc
http://orasoa.blogspot.com 
But the question here was not when to use BPEL or ESB.
Or are you saying that BPEL is the tool to use for moving data from one database to another?
If that is not the case, then the question is when to use Data Integrator and when to use ESB.
ESB is better suited to large quantities of smaller messages than to bulk data. There should be no problem processing data of a few MB using ESB. Very large files can also be debatched and treated as multiple smaller messages. For loading bulk data, ODI is a better choice than ESB or BPEL.
Is it a one-time movement of 3M records, or are you going to move them every day/week?
You could also look at Oracle ODI to do this, or create a 1:1 integration based on Oracle database technology: database links / import-export / materialized views.
Marc
http://orasoa.blogspot.com 
Hi,
Have you looked at Oracle Discoverer to do your ETL, or maybe even something like Advanced Replication? Think about it: SOA is not the right hammer for every problem. ESB and BPEL are both message-based solutions, hence implicitly good at delivering (small) messages in a heterogeneous applications environment, but not the most performant way to do ETL on large data volumes. In addition, ESB and BPEL are designed for integrating enterprise applications at the middle-tier level. Why should one opt for doing a DB-to-DB integration via the middle tier? "Good old" batch-wise ETL might actually perform better with high volumes. And it's a proven technology.
Regards,
Harm 
I understand that there is a choice of tools, so if someone has a pointer to some kind of guidelines, please post it here.
BTW, our choice of ESB is based on the fact that exactly this kind of data transfer works excellently in InterConnect, and since SOA Suite and ESB are supposed to be the products that replace InterConnect, it seemed like the way to go.
I don't have any specific arguments, nor metrics from my own experience.
But I can make the landscape of options even more complex, as you can expose functionality from ODI as web services and integrate that with ESB.
Bringing in ODI as an extra component in your architecture means that you have to install, configure and maintain it too. There is probably also a learning curve involved.
So intuitively I would say that my first option would be to find out if I can overcome the memory problem at short notice. If that proves to be complex, or involves some custom "tweaking" that might be difficult or even impossible to migrate in the future, I would definitely look into ODI. After all, ODI is specialized in this, so it will probably be more efficient and performant and will offer more options.
--
Jan Kettenis 
This has been with Oracle Support since September, and since they can't fix it, I must look for some other way.
Sorry to hear that. In that case I would go for the option of implementing the functionality using ODI, exposing that as a web service, and making it available through the ESB.
--
Jan Kettenis

Database Polling

Hi All,
I would like to poll certain databases with some standard query at, say, 2-second intervals, or even in real time. I would like to do the same for multiple databases and, based on the query, ultimately monitor metrics on qualifying records that are created, updated, etc. Typically this would be intensive and could hamper performance of the application that is using the databases. I am sure many of you have experienced this kind of scenario, and I would appreciate it if you could share your experience.
My understanding is that if the JDBC drivers are properly tuned, and we ensure that the queried columns and tables are properly indexed, that would reduce any possible latencies. We do have Enterprise Manager and Data Guard, and we have considered copying the data over to another database and deriving metrics there. We may ultimately go this route, but it does not solve the issue of real-time monitoring. I would like to hear any alternative and unique approaches, and I would appreciate your input.
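To make it concrete, this is roughly what I mean by polling. The table, columns, and connection details are just placeholders, and the column used for the delta (last_updated here) would need to be indexed:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ChangePoller {
    private volatile Timestamp lastSeen = new Timestamp(0);

    public void start(String jdbcUrl, String user, String pwd) {
        // Run the same delta query every 2 seconds.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> poll(jdbcUrl, user, pwd), 0, 2, TimeUnit.SECONDS);
    }

    private void poll(String jdbcUrl, String user, String pwd) {
        String sql = "SELECT id, status, last_updated FROM orders WHERE last_updated > ? ORDER BY last_updated";
        try (Connection con = DriverManager.getConnection(jdbcUrl, user, pwd);
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setTimestamp(1, lastSeen);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    lastSeen = rs.getTimestamp("last_updated");
                    // derive/update the metrics for this qualifying record here
                    System.out.println("changed: " + rs.getLong("id") + " -> " + rs.getString("status"));
                }
            }
        } catch (Exception e) {
            e.printStackTrace(); // keep the schedule alive even if a single poll fails
        }
    }
}

(Opening a new connection on every poll is something a connection pool would take care of in practice; my concern is exactly this kind of repeated query hitting the source databases.)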
Regards 
Hi
I'm responding to this with my BPM hat on (about the only hat I've got).
Tell us more about your use case if you get a minute. As much as it pains me to say this, BPM might not be the best solution for what you're trying to do.
Just going on what you've posted here, polling a database at sub-second intervals or recognizing database changes in real time might be a better fit for another technology built specifically for this.
Others on this forum will probably have a better idea, but depending on what it is that you're trying to do you might want to look at CEP (Complex Event Processing) Server technology. This has the capability to collect feeds from hundreds of different sources (database feeds are supported) with zero latency (ok - very little latency).
Here's where BPM might come into play in this scenario. Once an event (e.g. some row on one of your tables changes) is caught, the CEP server could then push the information into an Oracle BPM process. A couple weeks ago, I used this technique to create a new work item instance in an Oracle BPM process and then used it to communicate with an existing work item instance (alerting the instance that the order items in the order had changed and how they changed) via the API into Oracle BPM.
Not trying to push CEP on you, but the second I saw "realtime" in your post it came to mind. Sure there are other options others will suggest.
Dan
Edited by: Daniel Atwood on Mar 19, 2009 10:33 AM 
You are absolutely right, Daniel. We are exploring the possibility of using CEP as part of the BPM suite and accessing the Oracle log using ODI. Have you used ODI before, or tried accessing the log? I am a little apprehensive: even though we are not hitting the database, shouldn't we be accessing the log after it closes, to prevent lockouts?

Integration Approach - Recommendation

Hi All,
I am new to Fusion Middleware and am hearing about many tools for many purposes.
I have read that ODI can be used to integrate across heterogeneous databases, and also that ODI is especially good for large-volume data integrations.
Also, something similar can be done using BPEL.
I just want to know how to decide which tool to use for which sort of integration scenario. Is there some recommendation or checklist which can be used to identify the right tool for an integration scenario?
Regards,
G V Ranjith 
Hi,
As I see it, you can get many answers and explanations. It depends on whom you ask.
I recommend first asking some "small" questions: what is the need?
Do you handle massive data transformation? Complex manipulation? Do you need good monitoring (user monitoring)? Are you working only, or mostly, with a database?
Is your process short- or long-running? Will your process run once a day or 1,000 times a minute?
Some of the capabilities of one integration tool can be found in another, and vice versa.
Sure, each tool has its strong advantages, but first you need to know what you need.
Riko
