My name is Batch, Spring Batch

Enterprise applications very often need to import or export large amounts of data from or to external systems. These operations may be required on a regular schedule; to implement them you can choose among several solutions provided by frameworks (like Spring Batch), APIs (Quartz) and tools (ETL). One of the best solutions is a scheduled batch process, which easily fits that kind of technical requirement. In this post I'd like to provide an easy introduction to Spring Batch; there are tons of how-tos and examples on the internet, but when I tried to set up a simple solution at home in a new workspace I ran into some small issues. My goal is therefore to provide a quick overview of the main features and to show how to set up a Java project for a Hello World Spring Batch example.

What is Spring Batch?

Spring Batch is a framework that provides functions to process large volumes of records, manage transactions, start job operations and store their statistics. It is based on Spring Core, so you can use dependency injection and all the Spring libraries. It is built on the simple model shown in the batch-model diagram below:

As you can see, the main concept is the job; a job is composed of one or more steps. It is executed through a JobLauncher and all data related to the execution is persisted in a JobRepository. In other words, when you design and implement a Spring Batch job, you have to figure out how many steps your job needs, which JobLauncher you have to use and where you want to store the job execution data.

Environment setup

I assume that today every Java programmer has Maven installed on his machine and/or the Maven plugin in his preferred IDE; so the first step to get Spring Batch ready is to run this command on your command line:

mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=HelloSpringBatchWorld -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false

It will create a standard Maven structure with a pom.xml file in its root directory; at this point you can replace its content with something like this (the Spring Batch version below is just an example, use the one matching your setup):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <!-- keep the groupId/artifactId/version generated by the archetype -->
  <groupId>com.mycompany.app</groupId>
  <artifactId>HelloSpringBatchWorld</artifactId>
  <version>1.0-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>org.springframework.batch</groupId>
      <artifactId>spring-batch-core</artifactId>
      <version>2.2.7.RELEASE</version>
    </dependency>
  </dependencies>
</project>

The environment isn't ready yet: it needs to load all the artifacts related to Spring Batch in order to use its jar files; to do that you need to run this Maven command:

mvn clean dependency:copy-dependencies

This command will download all the jar files and allow you to begin coding the example from your preferred IDE; I'm going to show how to do it with Eclipse. Open an Eclipse IDE instance and, from an existing workspace, choose "File" -> "Import…" -> "Maven" -> "Existing Maven Project"; now you have to navigate to the directory containing the pom.xml file and complete the import.

Let’s code!

As you can see, you have a main class named App under the generated package. In a few minutes we will use this class to run the job, but something is still missing; we need to edit two XML files. The first one (app1-context.xml) is where we have to define the structure of the job; in this case it's very simple, we just want to show a Hello World message on the console.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd">

    <import resource="classpath:/META-INF/simple-springbatch-conf.xml" />

    <!-- JOB DECLARATION -->
    <batch:job id="HelloSpringBatchWorld">
       <batch:step id="HSBWStep">
         <batch:tasklet ref="HSBWTasklet"/>
       </batch:step>
    </batch:job>

    <!-- use the fully-qualified name of the tasklet class in your package -->
    <bean id="HSBWTasklet" class="HelloSpringBatchWorldTasklet" />

</beans>

Let's focus on the "JOB DECLARATION" section: it defines a batch job whose id is "HelloSpringBatchWorld"; it has just one step, and the class holding its implementation is the bean whose id is "HSBWTasklet". Let's take a look at this class:


import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class HelloSpringBatchWorldTasklet implements Tasklet {

	public RepeatStatus execute(StepContribution arg0, ChunkContext arg1) throws Exception {
		System.out.println("Hello SpringBatch World!!!");
		return RepeatStatus.FINISHED;
	}
}


It is a basic Tasklet implementation; it has only one method, which in our example prints to the System.out stream and returns the FINISHED status to Spring Batch. There's no need to spend more time on this class, but remember this is just a hello-world example; in many real cases you will use this same class structure to implement very complex logic. If we go back to app1-context.xml, we can see that it imports another file: simple-springbatch-conf.xml.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

  <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
  </bean>

  <bean id="jobRepository" class="org.springframework.batch.core.repository.support.SimpleJobRepository">
    <constructor-arg>
      <bean class="org.springframework.batch.core.repository.dao.MapJobInstanceDao"/>
    </constructor-arg>
    <constructor-arg>
      <bean class="org.springframework.batch.core.repository.dao.MapJobExecutionDao" />
    </constructor-arg>
    <constructor-arg>
      <bean class="org.springframework.batch.core.repository.dao.MapStepExecutionDao"/>
    </constructor-arg>
    <constructor-arg>
      <bean class="org.springframework.batch.core.repository.dao.MapExecutionContextDao"/>
    </constructor-arg>
  </bean>

  <bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>

</beans>

This file is very important for non-production environments: it overrides the Spring Batch default values for the jobRepository, jobLauncher and transactionManager beans; these customized beans allow you to use an in-memory repository, avoiding the use of a database. The repository defines the structure where metadata related to the execution of the job and of its steps is stored. Using an in-memory approach lightens the job, but all the data useful to trace job executions is lost when the process ends.
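For completeness: in a real deployment, where you do want to keep those statistics, the Map-based beans are typically replaced by a database-backed repository. A minimal sketch, assuming a dataSource bean (and a real DataSourceTransactionManager as transactionManager) is defined elsewhere in the context:

```xml
<!-- stores job/step execution metadata in the Spring Batch tables -->
<bean id="jobRepository"
      class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
  <property name="dataSource" ref="dataSource"/>
  <property name="transactionManager" ref="transactionManager"/>
</bean>
```

With this variant the execution data survives restarts, at the cost of creating the Spring Batch schema in your database.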

Let’s run the Job!

We are at the end of our HelloWorld example; we have everything ready to see our job working. The code below is the main class we need to run it.


import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class App {
	public static void main(String[] args) throws Exception {

		String[] springConfig = { "META-INF/app1-context.xml" };
		ApplicationContext context = new ClassPathXmlApplicationContext(springConfig);
		JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
		Job job = (Job) context.getBean("HelloSpringBatchWorld");

		JobExecution execution = jobLauncher.run(job, new JobParameters());
		System.out.println("Job status: " + execution.getStatus());
	}
}


It is very easy: it creates an array of Strings referencing the configuration file where all the beans are defined; it creates a Spring context from which it retrieves the JobLauncher and the Job. Then it executes the Job with an empty JobParameters. That's all!


As I said before, this is just a post where I want to show how easy it is to build a software infrastructure able to execute a Spring Batch job. You can start from this post and add more stuff to meet your needs.

Posted in Java, Spring, Spring Batch

Bye Bye 2014

Here I am, 2014: you are ending and I'm thinking about how you have been and what I've done.
As you can see, during this year I haven't written many articles on my personal blog.
That is due to several reasons; the main one is that I've had less time to spend on my personal "technical" life.
During this year I had several ideas to post and share on this blog, some belonging to the Openbravo world and others to the Spring Framework atmosphere. Before you go, I want to tell you that you were a very important year, with few shadows and a lot of light along the way. I've left my former company (Extra) and, so far, I've left my Openbravo know-how there.

The past

I'm no longer working on that wonderful open source ERP platform. I'm sorry about these things, leaving Extra and not working on Openbravo, but someone better than me said "the show must go on", and it is going on. I'm still following Openbravo's evolution: every month I spend 2-4 hours upgrading it through Mercurial, and I'm still reading several basic items related to the Openbravo development and delivery environment.
I think that switching to a "quarterly" release mode was a good idea; companies having Openbravo on board will spend less time (money) upgrading their systems to the latest release. Even the "all-in-one" solution called Commerce Platform is a good idea to support enterprise retailer solutions.

The present

Dear 2014, it's time to talk about my new experience; since March I've been working as a consultant at Alliance Healthcare (Napoli).
They/we are developing an enterprise program to support their IT offices, warehouses and pharmaceutical stores located in several countries.
I'm playing the Scrum Master role, "leading" (I know this word is forbidden in the Scrum world) a team of 5 to 7 people. During the last 9 months I've been refreshing several tools and frameworks that I had disregarded (but not ignored) in past years:
I'm using Maven, JBoss, Spring and Oracle every day. I'm focusing my little extra work time on studying in depth the Spring projects based on Spring Core, paying particular attention to Spring Batch and Spring Integration.

I Promise …

I want to leave you with at least one promise: I promise to find more time to write more blog posts and little side projects. I'd like (I already have several ideas …) to merge two development worlds that I love … maybe I'm talking about OB and Spring, but I'm not so sure 🙂 🙂 🙂
bye bye my friend,

Posted in Miscellaneous

Openbravo has dinner with Groovy

In the previous two posts (Openbravo meets Groovy and Openbravo has lunch with Groovy) we saw how to install the Groovy Adapter module and how to execute Groovy code on Openbravo; we also saw how to define a Java process, use Groovy to define its business logic, and become more "agile and responsive" to customer change requests. I'd like to close this series of Groovy and Openbravo posts by showing you how to use this module to write callouts and event handlers.

Ladies and gentlemen, Groovy callouts!!!

Openbravo allows the developer to manage the event raised when the user fills a field and moves the focus to another field. These events may be managed through callouts; within the Application Dictionary the developer can link a DB column to an already defined callout. A callout is a Java class that handles this kind of situation. Imagine you want to implement a callout that, when the user chooses a Product Category within the Product UI, fills the product description with a string and the current date. In Java you can do that with this simple class:

public class JavaProductCallout extends SimpleCallout {

  protected void execute(CalloutInfo info) throws ServletException {
	Date d = new Date();
	info.addResult("inpdescription", "Description updated " + d);
  }
}


As I wrote in the last post, that's standard Openbravo, but we can improve the response time to the next requirements about this feature by using Groovy. To reach this improvement we have to define a Java class like this:

public class SimpleGroovyScriptProductCallout extends BaseGroovyScriptCallout {

  protected String getGroovyScriptName() {
    return "GPC";
  }
}

This class has no business logic; it just points to a Groovy script defined in the Groovy Console UI and named "GPC". Note that every Groovy script linked to a class extending BaseGroovyScriptCallout gets an object "info", an instance of CalloutInfo, injected. That's why it is enough to define this Groovy script in order to implement the same business logic we defined with JavaProductCallout:

def date=new Date()
info.addResult("inpdescription","Description updated $date");

Groovy Event handlers? Why not?

"The business entity event allows you to implement business logic which reacts to specific events which are fired when entities are updated, deleted or inserted into the database": this is how an event handler is defined in the official Openbravo documentation. This is the basic structure of a Java event handler:

public class ProductEventHandler extends EntityPersistenceEventObserver {

  private static Entity[] entities = { ModelProvider.getInstance().getEntity(Product.ENTITY_NAME) };

  protected Entity[] getObservedEntities() {
    return entities;
  }

  public void onUpdate(@Observes EntityUpdateEvent event) {
    // manage update event
  }

  public void onSave(@Observes EntityNewEvent event) {
    // manage save event
  }

  public void onDelete(@Observes EntityDeleteEvent event) {
    // manage delete event
  }
}

As you can see, there are three methods that manage the update, save and delete events; you have to put your business logic within these three methods. There is also a method, getObservedEntities(), that returns the array of entities whose events this class wants to manage. In order to use the Groovy language (through the Groovy Adapter module) it's enough to define a class like this one:

public class SimpleGroovyScriptProductEH extends BaseGroovyScriptEventHandler {

  private Entity[] entities = { ModelProvider.getInstance().getEntity(Product.ENTITY_NAME) };

  protected void initChainOfResp() {
    this.respChain.put(EntityNewEvent.class, "UPDATE_SAVE_PRODUCT_HANDLER");
    this.respChain.put(EntityUpdateEvent.class, "UPDATE_SAVE_PRODUCT_HANDLER");
  }

  protected Entity[] getObservedEntities() {
    return this.entities;
  }
}

Let's start from getObservedEntities(): it keeps the same semantics. initChainOfResp(), instead, is a method that defines the mapping between an event and the name of the Groovy script that will be invoked when the event occurs. In this sample both the New and Update events will be managed by the UPDATE_SAVE_PRODUCT_HANDLER script.


This script has to be defined using the Groovy console UI.

def obj= event.targetInstance
def entity=obj.entity
event.setCurrentState(entity.getProperty("description"),"$name SimpleGroovyResourceScriptProductEH "+new Date())

It's easy to understand what this code does; the only thing to keep in mind is that the BaseGroovyScriptEventHandler class injects the object "event", which you can use to manage the linked event.

What’s the next step?

Ok, that's all for Openbravo and Groovy: they have met, they had lunch and at the end of the day they had dinner; I don't know if I will add other stuff to this module. I need time to think about it. Do you have any suggestions? Have you found some bug? Please add a comment or a tweet and stay in touch!

Posted in Groovy, Java, Openbravo

Openbravo has lunch with Groovy

In the last post (Openbravo meets Groovy) I showed how to install the Groovy Adapter module and its main UI, the Groovy Console; to use and access this UI you have to use the System Administrator role. That's because its usage may be very dangerous, so it's better to restrict it to a responsible user. But if you are reading this post too, I think you are asking yourself: why should I add this module to my Openbravo instance?

On the fly updates

I've been working for 5 years on several Openbravo projects; very often, even in production environments, we face the issue of simple updates to be done on a set of data. In that case we have to access the operating system where the instance has been deployed and, through psql/sqlplus commands, execute SQL update or insert statements. Not a very comfortable solution, I think. Another solution may be to open a tunnel connection to that server (if possible) and use our preferred SQL client (TOAD, PgAdmin or something else); more comfortable, but still not enough. For instance (consider it just a sample/POC), imagine your customer calls you asking to modify the price of all products that are using the Price List Version "Sales PL" whose Valid From Date is 10 Nov 2013. These prices have to be increased by 10% due to a new decision taken by the marketing team. You can follow one of the two solutions described before, but a very useful way to satisfy that request could be to access as System Administrator and run that update directly on a web page belonging to the Openbravo suite. Ok, let's try to do that; imagine you have to use Java code with no class definition:

import java.sql.PreparedStatement;
import java.sql.Date;
import org.openbravo.dal.service.OBDal;
String sql="UPDATE m_productprice set  pricelist = pricelist + pricelist * 0.1 where m_pricelist_version_id in ";
sql+=" (select plv.m_pricelist_version_id from m_pricelist_version plv where plv.name=? and plv.validfrom=?)";
PreparedStatement ps = OBDal.getInstance().getConnection().prepareStatement(sql);
ps.setString(1,"Sales PL");
ps.setDate(2,new Date(113,10,10));
int q=ps.executeUpdate();
return q;

Nothing needs to be explained; the awesome thing is that this is a working Groovy script. You can put this code into the Source field of the Groovy Console UI, push the Execute Groovy button and it will solve the customer request. Very nice, isn't it?
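A side note on new Date(113,10,10): the deprecated java.sql.Date(int, int, int) constructor takes the year minus 1900 and a zero-based month, so those arguments really do mean 10 Nov 2013. A quick standalone sketch to convince yourself:

```java
import java.sql.Date;
import java.time.LocalDate;

public class LegacyDateDemo {
    @SuppressWarnings("deprecation")
    public static void main(String[] args) {
        // year - 1900 = 113, month index 10 = November, day of month = 10
        Date legacy = new Date(113, 10, 10);
        System.out.println(legacy.toLocalDate()); // 2013-11-10
        System.out.println(legacy.toLocalDate().equals(LocalDate.of(2013, 11, 10))); // true
    }
}
```

In new code you would rather build the value with java.time and Date.valueOf, but the legacy constructor is what the snippet above uses.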
I also want to show a more "Groovy-oriented" approach:

import org.openbravo.dal.service.OBDal
import groovy.sql.Sql
def sql="""
UPDATE m_productprice set  pricelist = pricelist + pricelist * 0.1 where m_pricelist_version_id in
(select plv.m_pricelist_version_id from m_pricelist_version plv where plv.name=? and plv.validfrom=?)
"""
def sqlc=new Sql(OBDal.getInstance().getConnection())
def res=sqlc.executeUpdate(sql, ['Sales PL',new java.sql.Date(113,10,10)])

I think it is a step ahead: it shows Groovy's dynamically typed nature, the utility of the """ string delimiter, and the usefulness of the groovy.sql.Sql class.

Make background Processes more agile

Imagine you have to write a background Java process that has to decrease by 5%, each day, the price of all the products belonging to the price list version named "Price List V 2013_1". You need a simple Java class with little code; something like this should be enough:

//import statements
public class PriceCorrectionProcess extends DalBaseProcess {

  protected void doExecute(ProcessBundle bundle) throws Exception {
    // implement the business logic using Java and the DAL
  }
}

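One detail worth keeping in mind with this kind of job is that a daily percentage decrease compounds: after n runs the price is price × 0.95^n, not price × (1 − 0.05 × n). A small plain-Java sketch (no Openbravo classes) of what three runs do to a price of 100:

```java
public class DailyDecreaseDemo {
    public static void main(String[] args) {
        double price = 100.0;
        for (int day = 1; day <= 3; day++) {
            price -= price * 0.05;           // what each scheduled run does
            System.out.printf("day %d: %.4f%n", day, price);
        }
        // compounded: 100 * 0.95^3 = 85.7375 (a naive "3 times 5%" cut would give 85.0)
        System.out.println(Math.abs(price - 100 * Math.pow(0.95, 3)) < 1e-9); // true
    }
}
```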
You have to create a new Java process using the Report and Process UI; after that you have to add a Process Request linked to that process and schedule it daily. That's simply Openbravo, but what happens if your customer begins to use this process and after some days calls you and says: "Sorry, we noticed that the price is going down too fast, please use a 3% rate"? The answer?
You have to assign this task to a developer (or to yourself), and he has to replace a 5 with a 3 in the PriceCorrectionProcess Java code. He has to compile the code ("ant smartbuild") and build a new obx file. After that you have to call your customer and agree on the date and time you can deploy the updated module. All this stuff can take 4-8 hours; not so bad, but you can do it better and faster, and your customer will appreciate it. From version 0.0.4 of the Groovy Adapter module it is possible to define a DalProcess that runs a Groovy script. Take a look at this class:

public class SimpleGroovyScriptPriceProcess extends BaseGroovyScriptProcess {

  protected String getGroovyScriptName() {
    return "Price_Groovy_Process";
  }
}

It is very easy: you just have to define the name of the script it has to use in order to implement the customer requirement.

Thus all the business logic of your process can be defined in a groovy script like this:

import org.openbravo.dal.service.OBDal
import groovy.sql.Sql
def sql="""
UPDATE m_productprice set  pricelist = pricelist - pricelist * 0.05 where m_pricelist_version_id in
(select plv.m_pricelist_version_id from m_pricelist_version plv where plv.name=?)
"""
def sqlc=new Sql(OBDal.getInstance().getConnection())
def res=sqlc.executeUpdate(sql, ['Price List V 2013_1'])

If you use this approach, when the customer calls you asking to decrease the rate from 5% to 3%, you can do it in 2 minutes!!! You only need to access the production environment, switch to the System Administrator role, go to the Groovy Console UI, select the row whose name is "Price_Groovy_Process" and change 0.05 to 0.03 in the sql string variable. That's it!!!
However, as I keep repeating, be careful doing these kinds of operations: they may be very dangerous, since data is going to be changed on the DB. Maybe some Openbravo developers are thinking: ok, this is a case where I don't need to read a ProcessBundle object and I don't need to write to an OBError object; but how can I use this approach if I do need those two objects?
The answer is that each Groovy script invoked through a BaseGroovyScriptProcess class has two bound objects, bundle and msg; the first one is an instance of ProcessBundle, the second one is an OBError where you can write your results.

Waiting for next step

Ok, that's enough for this post, I don't want to bore you with verbose writing; we've seen two simple samples of how the Groovy Adapter module can improve Openbravo's agility. I'm now publishing version 0.0.4 on the forge. It allows you to develop a Java process putting the business logic in a Groovy script; it is also able to manage callouts using a similar approach. The callout usage will be described in the next post, stay tuned!

Posted in Groovy, Openbravo

Openbravo meets Groovy

The goal of this post is to show how easy it is to integrate Openbravo ERP and Groovy. Openbravo ERP is the world leader among open source ERPs. It is Java based and built on several standards such as Java Servlets, XML, JSON and Hibernate. Groovy is a programming language belonging to the world of JVM languages. It can be compiled in order to share its bytecode with other JVM languages, but it can also be used as a scripting language; in other words, you can use it with no compilation step, just loading a piece of code into a Java class and executing it. If you are a Java developer you can begin by writing plain Java code, and in almost 100% of cases it will work as Groovy code. Now it's time to begin; let's use this Openbravo module.

Installing Groovy Adapter module

There are two ways to install the Groovy Adapter module: downloading the obx file from the Openbravo Forge site, or installing it using the Module Management menu of Openbravo ERP with the System Administrator role. In the first case you have to point your browser to the Groovy Adapter module page and download the latest version of the module (now at 0.0.3).


Once you have downloaded the obx file, you can access Openbravo with the System Administrator role and choose the Module Management menu item.


Select the Add Module tab and, through the Browse File System button, load the obx file you downloaded from the Openbravo forge site.
The second way is easier: you just need to search for the module in the Add Module tab of the Module Management window, select the Groovy Adapter module and finish the installation process.


Groovy Console

Once you have installed the module and restarted Openbravo ERP, you can access the ERP with the System Administrator role and begin to play with Groovy. Choose the Groovy Console window and it will show this UI:


As you can see, there is a Source text area where you can write and store the Groovy code; on the top right side there is an Execute Groovy button. If you push it, the Groovy code will be executed and the output will be shown in the Output text area. If something goes wrong, an error will appear in the status bar message box; if everything works fine, the computation time will be shown instead. The Name field is mandatory and must be unique; for the moment we won't talk about how this field can be used, but we'll do so for sure in the next post.

First conclusions and next steps

Ok, I hope this post is enough for the moment; through this reading you are able to load the module and begin to play with simple Groovy scripts. Pay attention to its usage: this module should be used ONLY with the System Administrator role because it may be very dangerous; if you delete something through a script, that data will be lost forever. I have in mind to write other posts and to improve the module, extending its usage to cover several Openbravo development patterns (Java processes, callouts and so on…).

Posted in Groovy, Openbravo

Back to the Mule #1

Three years ago I used Mule as an Enterprise Service Bus in a Java project; its goal was to enable interoperability among different IT worlds. A month ago I had to do, in a new project, something very close to what I did 3 years before. Panic for me! I have just 2k of RAM in my mind 🙂 I had forgotten how Mule works, but I had also forgotten how easy it is to use. I searched for as many samples as I could because, as the architect, I had to show a customer that his web portal (built on Servlets + JSP and Hibernate) could send and receive data to/from his AS400 ERP. The agreement was to publish web services through a bus and avoid using JDBC queries directly within the portal. So the goal was to publish web services through Mule and add to Mule several components able to read and write to a DB2 server with standard JDBC statements. The tools I used for this technical POC (proof of concept) are Eclipse (Helios), Mule 2.0.2 and a Postgres 8.3 DB on an Ubuntu 10.10 distro; I know I talked before about DB2, but I missed a detail: I can't use the customer's environment due to security policies.

Installing Mule

After downloading Mule 2.0.2, I unzipped its content and added

export MULE_HOME=/home/roby/software/mule-2.0.2

at the end of my .bashrc file.

Server side programming

After that I needed to implement a web service in order to publish a service callable from the portal. Let's start with a simple DTO:

import java.io.Serializable;
import java.util.Date;

public class ADUserDTO implements Serializable {
	private static final long serialVersionUID = 1L;
	protected String userId = null;
	protected String clientId = null;
	protected String orgId = null;
	protected Date birthDay = null;
	protected String firstname = null;
	protected String lastname = null;

	public String getUserId() {
		return userId;
	}
	public void setUserId(String userId) {
		this.userId = userId;
	}
	// ... other getters and setters
}

This DTO will be managed by a simple interface:

. . .
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebResult;
import javax.jws.WebService;
public interface ADUserService {
	public @WebResult List<ADUserDTO> listAll(@WebParam String clientId);
	public @WebResult ADUserDTO read(@WebParam String userId);
	public @WebResult String create(@WebParam ADUserDTO user);
}

This interface has to be annotated with the standard Java JWS annotations. I think there is nothing to explain in this code, but it needs a concrete implementation class; let's see what it can look like.

public class ADUserServiceImpl implements ADUserService {
	protected ADUserDAO dao = null;

	public List<ADUserDTO> listAll(String clientId) {
		List<ADUserDTO> res = null;
		// retrieve the users through the DAO
		return res;
	}
	public ADUserDTO read(String userId) {
		ADUserDTO dto = null;
		// read a single user through the DAO
		return dto;
	}
	public String create(ADUserDTO user) {
		String id = null;
		// persist the user through the DAO
		return id;
	}
	public ADUserDAO getDao() {
		return dao;
	}
	public void setDao(ADUserDAO dao) {
		this.dao = dao;
	}
}

Leaving aside the details of the DAO class, we can assume that the web service is almost ready to be published. Thus, the next step is to pack all those classes into a jar. With Eclipse I usually right-click on my project, select Export -> Export…, and choose Jar file under Java.
Let's say it works fine; then we have a MyWSMuleProject.jar file in our home directory.

Publishing the Web Service

In order to publish the web service, three steps remain:
1) edit an XML config file under the MULE_HOME/conf folder
2) move the MyWSMuleProject.jar file to the MULE_HOME/lib/user folder
3) run the Mule server.

Assuming that step 2) doesn't need particular explanation, let's focus on the first one:
we have to define a Mule configuration file. The one below is what I wrote in wsconfig.xml.

<?xml version="1.0" encoding="UTF-8"?>
<!-- namespace/schemaLocation declarations trimmed to the essential ones -->
<mule xmlns="http://www.mulesource.org/schema/mule/core/2.0"
      xmlns:spring="http://www.springframework.org/schema/beans"
      xmlns:cxf="http://www.mulesource.org/schema/mule/cxf/2.0">

    <!-- Spring DI: use the fully-qualified name of your ADUserServiceImpl class here -->
    <spring:bean id="userService" scope="prototype" class="it.newopenlab.pocspringcompletews.service.ADUserServiceImpl">
        <spring:property name="dao">
            <spring:ref local="userDAO"/>
        </spring:property>
    </spring:bean>
    <spring:bean id="userDAO" class="it.newopenlab.pocspringcompletews.dao.ADUserDAO">
        <spring:property name="username" value="tad"/>
        <spring:property name="password" value="tad"/>
        <spring:property name="dbUrl" value="jdbc:postgresql://localhost:5432/ob"/>
        <spring:property name="driverClass" value="org.postgresql.Driver"/>
    </spring:bean>

    <!-- define a cxf endpoint -->
    <cxf:endpoint name="cxf" address="http://localhost:9080/mule/pocspringcompletews"
                  frontend="jaxws" synchronous="true"/>

    <model name="cfx-simple">
        <service name="world">
            <inbound>
                <cxf:inbound-endpoint ref="cxf"/>
            </inbound>
            <component>
                <spring-object bean="userService" />
            </component>
        </service>
    </model>
</mule>


There are three distinct sections: the first one declares the XML schemas we are going to use; the second one defines the Spring DI wiring of the classes we implemented; the last one is where we publish a CXF web service that exposes the methods annotated in the "userService" Spring bean.
Ok, we are ready to publish it; at this point it is enough to run:

$MULE_HOME>./mule -config ../conf/wsconfig.xml

To check that everything is working fine, open the http://localhost:9080/mule/pocspringcompletews?wsdl URL in a browser. If you get the WSDL of our web service, it's OK!

Posted in Java