Saturday, December 4, 2010

Unit Testing CXF Web Services

Here is a quick little unit test setup for exercising your CXF service class and your client class together. This is nice if you are chasing down marshaling issues.

  // Assumed imports for this snippet:
  // import java.util.List;
  // import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;
  // import org.apache.cxf.jaxws.JaxWsServerFactoryBean;
  // import org.junit.Assert;
  // import org.junit.Test;

  @Test
  public void testMyWebService() {
    // Publish the real service implementation on a local endpoint.
    MyWebServiceImpl endpoint = new MyWebServiceImpl();
    JaxWsServerFactoryBean svrFactory = new JaxWsServerFactoryBean();
    svrFactory.setServiceClass(MyWebServiceInterface.class);
    svrFactory.setAddress("http://localhost:9000/myService");
    svrFactory.setServiceBean(endpoint);
    svrFactory.create();

    // Build a real CXF client proxy pointed at the same address.
    JaxWsProxyFactoryBean factory = new JaxWsProxyFactoryBean();
    factory.setServiceClass(MyWebServiceInterface.class);
    factory.setAddress("http://localhost:9000/myService");
    MyWebServiceInterface client = (MyWebServiceInterface) factory.create();

    List details = client.performSomeWebServiceAction("11");
    Assert.assertNotNull(details);
  }


That's it! This will bootstrap an actual JAX-WS version of your service within your JVM and hit it with a real CXF client. One thing to watch out for: if you are running these tests on a CI server, be sure to use a port which is not already in use on that build box.
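If you would rather not hard-code the port at all, one option (a sketch of my own here, not from the snippet above) is to ask the OS for a free ephemeral port first and build the factory address from it:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FreePort {
    // Bind to port 0 so the OS picks an unused ephemeral port, then release
    // it so the CXF server factory can bind to it a moment later.
    static int findFreePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        int port = findFreePort();
        // e.g. svrFactory.setAddress("http://localhost:" + port + "/myService");
        System.out.println(port > 0 && port <= 65535);
    }
}
```

There is a small window where another process could grab the port between the close and the CXF bind, but on a build box that is rarely a problem in practice.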

Friday, December 3, 2010

Fixing Eclipse JDK Issue

If you have ever had the issue of Eclipse complaining that it is not running in a JDK, here is how to fix it. Open the eclipse.ini file at the root of the Eclipse installation and add these two lines near the top of the file (the -vm entry must come before any -vmargs line); change the path to point to your JDK...

-vm
C:\Program Files\Java\jdk1.6.0_11\jre\bin\javaw
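For reference, a rough sketch of how the top of eclipse.ini might end up looking (your file will contain other launcher lines, and the -Xmx value is just an illustrative existing option; the key point is that -vm and its path sit on two separate lines, before any -vmargs section):

```ini
-vm
C:\Program Files\Java\jdk1.6.0_11\jre\bin\javaw
-vmargs
-Xmx512m
```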

Restart Eclipse and you are done!

Thursday, December 2, 2010

CXF Interface Extending Another Interface

Just a fun little finding for the day. In the newer versions of CXF, an interface you define as a WebService cannot extend another interface. Calling an inherited method tends to throw a SOAP fault with the message "Unexpected Wrapper Element". The solution is to create another service for the interface you are extending, or just merge all the methods into one interface.
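To make the gotcha concrete, here is a minimal sketch (all names hypothetical, with the @WebService annotation shown as a comment so the snippet stays self-contained): the inherited method is not *declared* on the extending interface, which is roughly why CXF's wrapper-element mapping trips over it.

```java
public class CxfInterfaceNote {
    interface BaseOperations {
        String ping();
    }

    // BROKEN pattern: the service interface extends another interface.
    // @WebService
    interface BrokenService extends BaseOperations {
        String doWork(String input);
    }

    // WORKING fix: merge the inherited methods into one flat interface.
    // @WebService
    interface MergedService {
        String ping();
        String doWork(String input);
    }

    public static void main(String[] args) {
        // ping() is inherited, not declared, on BrokenService...
        System.out.println(BrokenService.class.getDeclaredMethods().length); // 1
        // ...but both methods are declared on the merged interface.
        System.out.println(MergedService.class.getDeclaredMethods().length); // 2
    }
}
```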

Wednesday, December 1, 2010

Spring JDBC Template Tuning

Had a fun little issue the other day with Spring's JDBC template. I was querying a fairly large data set, about 65 million records, and realized the job was running slower than molasses going uphill. After debugging the code I found the query itself was returning very fast, so the database was not the problem. The issue ended up being the JDBC template itself, namely the fetch size. If you do not configure the fetch size explicitly, the template uses the driver default, which for Oracle (the database I was working with) is 10.

The test query returned about 53,000 rows, so with a fetch size of 10 it took 5,300 network round trips to retrieve all the data and store it in memory. Before tuning, this operation took about 2.5 minutes to run.
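The round-trip arithmetic is simple enough to sketch (the row count and fetch sizes here are the ones from this post):

```java
public class FetchSizeMath {
    // Network round trips needed to pull all rows = ceil(rows / fetchSize).
    static long roundTrips(long rows, int fetchSize) {
        return (rows + fetchSize - 1) / fetchSize;
    }

    public static void main(String[] args) {
        System.out.println(roundTrips(53_000, 10));     // 5300 with the Oracle default
        System.out.println(roundTrips(53_000, 10_000)); // 6 after tuning
    }
}
```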

One thing to keep in mind when tuning the fetch size is memory consumption. You should be sure you will have enough memory available to store the result set. For my example I set the fetch size to 10,000 as my result sets were always very large. I was also working with a batch job which ran during off-peak hours, so I knew my available memory would be plentiful.


After setting the fetch size the operation took 15 seconds to run, way better than before. Nice!

Sample code:
In a DAO class extending JdbcDaoSupport I stuck this in an overridden initDao() method (JdbcDaoSupport exposes the underlying JdbcTemplate, which is where setFetchSize() lives; 10,000 matches the value above):
getJdbcTemplate().setFetchSize(10000);

Here is another good post about JDBC fetch size: http://webmoli.com/2009/02/01/jdbc-performance-tuning-with-optimal-fetch-size/

Dust off the old blog

It has been a couple of months since I added anything to this blog; I was too busy with work. I have a new job now that is structured a little differently, so I will have some time to add to this every once in a while. Here it goes...

Friday, March 26, 2010

Streaming large LOBs from MySQL to flash clients blew up HARD!

Had an interesting run-in the other day with a fundamental feature in one of my web applications. What is supposed to happen is this: a user logs into a Flash client app capable of streaming video and playing back an SWF version of a PowerPoint, which is streamed from a database. The Flash client defines a movie loader object which pulls a byte stream from a Spring MVC controller, which in turn pulls those bytes from MySQL using a Spring DefaultLobHandler implementation. This is where I think things went horribly wrong. Here is the issue:



During a webcast we had many people logging on at one time, about 100. As stated earlier, when a user successfully logs in they pull an SWF file from the database and load it into their client app. Usually this works perfectly fine with small files (< 1 MB in size), and has even worked with a much larger audience. On this occasion we had a 10 MB SWF file to stream from the database, and all hell broke loose. Once many people began signing on and downloading this file we started receiving MANY broken pipe exceptions from this controller. Right now it is not clear who was breaking the connection: the database, the DBCP 1.2.2 connection pool, the Spring MVC controller, or the Flash client apps. I am leaning towards either the database or the connection pool as the culprit here. One last clue: Tomcat totally hung when too many of these errors occurred and refused to serve any more pages. Even after stopping and starting the Tomcat service in Fedora it refused to work; I had to hard-reboot the servers.


Even after hard-rebooting the servers the problem continued: I still received broken pipe exceptions and Tomcat continued to hang.

I guess the next step is to try to recreate the situation. I am going to take my uber-jar approach and build out a JUnit test which spawns about 30 threads, each calling the download-ppt method on the web app. Using JUnitPerf I am going to make this available from the command line, install it on several Amazon EC2 servers, then nail the main web app with this test and simulate around 120 users downloading a massive PowerPoint file at the same time.
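The thread-spawning half of that test can be sketched like this; the actual HTTP download call is stubbed out, and all the names here are hypothetical:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class PptLoadTest {
    static final int THREADS = 30;

    // Stand-in for the real HTTP call to the web app's download-ppt controller;
    // a real version would open a connection and drain the response stream.
    static void downloadPpt() {
    }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(THREADS);
        CountDownLatch done = new CountDownLatch(THREADS);
        AtomicInteger completed = new AtomicInteger();
        for (int i = 0; i < THREADS; i++) {
            pool.execute(() -> {
                try {
                    downloadPpt();
                    completed.incrementAndGet();
                } finally {
                    done.countDown();
                }
            });
        }
        done.await(30, TimeUnit.SECONDS); // wait for all simulated downloads
        pool.shutdown();
        System.out.println(completed.get());
    }
}
```

Running one copy of this on four EC2 boxes would give the ~120 concurrent downloads I am after.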

Possible Solutions:

If it is found that streaming a large BLOB directly from the database to many clients will never work I have come up with a few different solutions.

  1. Continue storing the BLOB in the database AND add a filePath field to the database which points to a location on disk where the file is also stored. This would give me the confidence that I always have a copy of the file sitting in the database in case I need it; if for some reason the file on disk is deleted, I can always restore it. NOW, when a user requests the ppt file the system will first check whether the file is available on disk; if it is, just stream it to the user using regular file I/O. If the file is NOT on disk, synchronize the call, stream the file from the database to disk, then stream it to the client from disk.
  2. Pretty much the same solution as option 1, but instead of streaming the file from disk store it in Amazon S3 storage.
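Option 1 can be sketched roughly like this; the class, the method names, and the database stub are all hypothetical, and a production version would want sturdier locking around the cache fill than this:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class PptCache {
    private final Path cacheDir;

    PptCache(Path cacheDir) {
        this.cacheDir = cacheDir;
    }

    // Stand-in for streaming the BLOB out of MySQL via the LobHandler.
    byte[] loadFromDatabase(String pptId) {
        return ("swf-bytes-for-" + pptId).getBytes(StandardCharsets.UTF_8);
    }

    // Serve from disk when the file is there; otherwise pull it from the
    // database once, write it to disk, and serve from disk from then on.
    byte[] getPpt(String pptId) throws IOException {
        Path file = cacheDir.resolve(pptId + ".swf");
        if (!Files.exists(file)) {
            synchronized (this) { // crude guard against a cache-fill stampede
                if (!Files.exists(file)) {
                    Files.write(file, loadFromDatabase(pptId));
                }
            }
        }
        return Files.readAllBytes(file);
    }

    public static void main(String[] args) throws IOException {
        PptCache cache = new PptCache(Files.createTempDirectory("ppt"));
        byte[] first = cache.getPpt("42");  // fills the disk cache
        byte[] second = cache.getPpt("42"); // served straight from disk
        System.out.println(java.util.Arrays.equals(first, second));
    }
}
```

Swapping the disk write and read for S3 put/get calls gives option 2 with the same shape.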

Tuesday, March 9, 2010

SimpleJdbcInsert Sometimes Fails with Java 1.6.0_18

This one is pretty awesome: I am getting an InvalidParameter exception from SimpleJdbcInsert.executeAndReturnKey(). It only showed up when upgrading from Java 1.6.0_17 to 1.6.0_18. And the extra fun part is that it does not occur on every call to this method, only in isolated cases. It is totally awesome and I have no idea what is causing it... yet.

REVISION: 
Actually, this was due to the fact that I was passing a poorly formatted date string to the SimpleJdbcInsert object. I guess JDK 1.6.0_18 is pickier than its predecessor.
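A defensive sketch of the fix (the parameter map and column name here are hypothetical): parse the string up front into a real java.sql.Timestamp so the driver never sees a free-form date string at all:

```java
import java.sql.Timestamp;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.HashMap;
import java.util.Map;

public class InsertParams {
    // Parse the incoming string once and hand SimpleJdbcInsert a typed
    // java.sql.Timestamp instead of a raw date string.
    static Map<String, Object> params(String createdAt) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setLenient(false); // reject sloppily formatted dates up front
        Map<String, Object> p = new HashMap<>();
        p.put("created_at", new Timestamp(fmt.parse(createdAt).getTime()));
        return p;
    }

    public static void main(String[] args) throws ParseException {
        Map<String, Object> p = params("2010-03-09 12:30:00");
        System.out.println(p.get("created_at") instanceof Timestamp);
        // insert.executeAndReturnKey(p); // hypothetical SimpleJdbcInsert call
    }
}
```

With setLenient(false), a bad input string fails fast with a ParseException in my code instead of a confusing driver-level error at insert time.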