Apache CXF is an open source services framework which is a result of the merge between the XFire and Celtix projects. CXF helps us build and develop services using JAX-WS. These services can speak a variety of protocols such as SOAP, XML/HTTP, RESTful HTTP, or CORBA and work over a variety of transports such as HTTP, JMS or JBI.
A java blog with a collection of examples and tutorials on Java and related technologies. (Under maintenance with continuous updates) Be in touch with java jazzle or k2java.blogspot.com.
Thursday, June 30, 2011
Introduction to Apache CXF
Books for CXF web service
Apache CXF web service Development
Do you use Apache CXF? If so, you might take an interest in "Apache CXF Web Service Development".
The book does a good job of covering CXF use cases, going beyond the usual trivial Jax-WS examples. It also covers Jax-RS (RESTful) web services, and covers each in enough detail that you're likely to find what you need when working with CXF.
Jax-WS has largely demystified basic web service development, so there's a great amount of content on the web that will show you how to quickly annotate a POJO to get a web service up and running. But what if you need to do contract-first (top down) development? Lightweight resources often conveniently bypass this more difficult trail, but this book does a good job of handling it. (This is no great accomplishment for a book on web service development, but it does set the tone for the types of things this book will show.)
Who is the book for? I'd say a Java developer either currently using or wanting to use Apache CXF. The book isn't a complete reference for CXF, but it does introduce all the important topics. Once a topic is introduced, there's enough content to either solve your problem or at least educate you enough to effectively research what remains.
This book can be found here.
Types of developmental styles for Web services
There are two development styles: Contract-Last (also called Code-First) and Contract-First.
Code-First approach or Bottom-Up approach
When using a contract-last approach, you start with the Java code, and let the Web service contract (WSDL, see sidebar) be generated from that.
Contract First approach OR Top Down approach
When using contract-first, you start with the WSDL contract and use Java to implement it: generate Java classes from the WSDL and implement the service.
The Contract-First approach requires a good understanding of WSDL and XSD (XML Schema Definition) for defining message formats. It's a good idea to start with Code-First if you are fairly new to web services. Later you will look at how to start web service development using the Contract-First approach.
Tomcat - JSP Precompilation
A JSP is usually compiled at runtime by the Java server. This has some disadvantages:
1. If your JSP page is large, it will take time to compile at runtime. On the first hit, the user has to wait before the page is served. This is a performance bottleneck.
2. Although current IDEs, such as Eclipse, provide JSP syntax checking, you may still run into runtime errors caused by JSP compilation issues.
To overcome these two issues, you can precompile your JSP pages before deploying them to the server. Tomcat comes with JSP precompilation tools for this purpose.
The following steps describe how to perform JSP precompilation:
1. Make sure you have your Tomcat server installed. Alternatively, you only need the jar files from tomcat/bin and tomcat/lib, as well as tomcat/bin/catalina-tasks.xml. catalina-tasks.xml is a helper file that loads the Catalina Ant tasks for you, i.e., the jasper task.
2. Make sure you have Apache Ant installed
3. Add the following build script. This build script assumes that your JSP sources are in your web application directory.
<project name="Webapp Precompilation" default="all" basedir=".">
    <import file="${tomcat.home}/bin/catalina-tasks.xml"/>
    <target name="jspc">
        <jasper validateXml="false"
                uriroot="${webapp.path}"
                webXmlFragment="${webapp.path}/WEB-INF/generated_web.xml"
                outputDir="${webapp.path}/WEB-INF/src"/>
    </target>
    <target name="compile">
        <mkdir dir="${webapp.path}/WEB-INF/classes"/>
        <mkdir dir="${webapp.path}/WEB-INF/lib"/>
        <javac destdir="${webapp.path}/WEB-INF/classes"
               optimize="off"
               debug="on"
               failonerror="false"
               srcdir="${webapp.path}/WEB-INF/src"
               excludes="**/*.smap">
            <classpath>
                <pathelement location="${webapp.path}/WEB-INF/classes"/>
                <fileset dir="${webapp.path}/WEB-INF/lib">
                    <include name="*.jar"/>
                </fileset>
                <pathelement location="${tomcat.home}/lib"/>
                <fileset dir="${tomcat.home}/lib">
                    <include name="*.jar"/>
                </fileset>
                <fileset dir="${tomcat.home}/bin">
                    <include name="*.jar"/>
                </fileset>
            </classpath>
            <include name="**"/>
            <exclude name="tags/**"/>
        </javac>
    </target>
    <target name="all" depends="jspc,compile"/>
    <target name="cleanup">
        <delete>
            <fileset dir="${webapp.path}/WEB-INF/src"/>
            <fileset dir="${webapp.path}/WEB-INF/classes/org/apache/jsp"/>
        </delete>
    </target>
</project>
4. Run the script with ant -Dtomcat.home="your tomcat server install home" -Dwebapp.path="your jsp source path". As you can see, tomcat.home is used to locate catalina-tasks.xml and webapp.path is used to locate your libraries and JSP source code. By changing these variables accordingly, you can customize your build path.
5. By default, it will compile your JSPs into class files and put them at "webapp.path"/WEB-INF/classes
Now, how do you use these compiled JSP class files? There are two ways:
1. Locate "webapp.path"/WEB-INF/generated_web.xml and copy its contents into your web.xml
2. Copy all files at "webapp.path"/WEB-INF/classes into your Tomcat server's work folder. The common path is "your_tomcat_home"/work/Catalina/localhost/_/
You may want to ask why we are not using the Ant JspC task. That task is deprecated due to known problems with Tomcat, and it won't be fixed by Apache Ant either.
To date, there are two known issues with JSP precompilation. Below is an excerpt from the Tomcat documentation:
As described in bug 39089, a known JVM issue, bug 6294277, may cause a java.lang.InternalError: name is too long to represent exception when compiling very large JSPs. If this is observed then it may be worked around by using one of the following:
reduce the size of the JSP
disable SMAP generation and JSR-045 support by setting suppressSmap to true.
Image Compression using Java JAI API
This post explains how to do GIF, JPG, BMP and PNG image compression using the JAI APIs. We are not using image compression software, but open source Java APIs, to achieve compression.
The preferred method for reading an image file of any format into a RenderedImage is:
String filename = "// path and name of the file to be read, on an accessible filesystem //";
RenderedImage image = JAI.create("fileload", filename);
or:
URL url = "// URL of the remote image to be read //";
RenderedImage image = JAI.create("url", url);
And the preferred method for writing a RenderedImage to an image file, in a format whose encoder is supported by an ancillary codec and using the default encoding algorithm, is:
RenderedImage image = "// the image to be stored //";
String filename = "// path and name of the file to be written //";
String format = "// the format of the file //";
RenderedOp op = JAI.create("filestore", image, filename, format);
The Java Image I/O API
Due to the many requests for a comprehensive image I/O package, the Java Image I/O API was developed. The Java Image I/O API is part of the Java 2 Platform, Standard Edition, version 1.4 (J2SE 1.4).
The Future of Image I/O in JAI
A package set called JAI-Image I/O Tools has been released and is available via the JAI home page. The package set includes image reader and writer plug-ins for the Java Image I/O API for numerous formats, image streams which use the Java New I/O API, and JAI operations for reading and writing images using the Java Image I/O API.
In a future JAI release, the image I/O-related operators in JAI-Image I/O Tools will be propagated to JAI. It has not been definitively determined as yet, but it is likely that when the new I/O operators have been added to JAI the old operations will be deprecated.
The classes currently in the com.sun.media.jai.codec and com.sun.media.jai.codecimpl packages will most likely be removed concurrent with a JAI release subsequent to that in which the Java Image I/O API-based operators become available. However, Sun is making publicly available the source code of the com.sun.media.jai.codec and com.sun.media.jai.codecimpl classes so that developers who have written code based on them will still be able to use them. Please note that no technical support may be provided for these classes once they have been superseded by the Java Image I/O API. The following code does the compression of the image.
Here is the full code listing:
import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.awt.image.ColorModel;
import java.awt.image.RenderedImage;
import java.awt.image.WritableRaster;
import java.awt.image.renderable.ParameterBlock;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Hashtable;
import java.util.Iterator;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.FileImageOutputStream;
import javax.media.jai.JAI;
import javax.media.jai.RenderedOp;
import javax.media.jai.operator.FileLoadDescriptor;
import com.sun.media.jai.codec.SeekableStream;

// Compresses the file. jpg, gif, bmp and png file formats are accepted;
// the compressed output is always written as a jpg.
private void compressFile(String realPath, File in, String fileName) {
    BufferedImage input = null;
    try {
        if (fileName.endsWith(".jpg") || fileName.endsWith(".JPG")) {
            RenderedImage img1 = (RenderedImage) JAI.create("fileload", in.getAbsolutePath());
            input = getBufferedImage(fromRenderedToBuffered(img1));
        } else if (fileName.endsWith(".gif") || fileName.endsWith(".GIF")) {
            RenderedOp img1 = FileLoadDescriptor.create(in.getAbsolutePath(), null, null, null);
            input = getBufferedImage(img1.getAsBufferedImage());
        } else if (fileName.endsWith(".bmp") || fileName.endsWith(".BMP")) {
            try {
                // Wrap the InputStream in a SeekableStream.
                InputStream is = new FileInputStream(in);
                SeekableStream s = SeekableStream.wrapInputStream(is, false);
                // Create the ParameterBlock and add the SeekableStream to it.
                ParameterBlock pb = new ParameterBlock();
                pb.add(s);
                // Perform the BMP operation.
                RenderedOp img1 = JAI.create("BMP", pb);
                input = getBufferedImage(img1.getAsBufferedImage());
                is.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        } else if (fileName.endsWith(".png") || fileName.endsWith(".PNG")) {
            try {
                // Wrap the InputStream in a SeekableStream.
                InputStream is = new FileInputStream(in);
                SeekableStream s = SeekableStream.wrapInputStream(is, false);
                // Create the ParameterBlock and add the SeekableStream to it.
                ParameterBlock pb = new ParameterBlock();
                pb.add(s);
                // Perform the PNG operation.
                RenderedOp img1 = JAI.create("PNG", pb);
                input = getBufferedImage(img1.getAsBufferedImage());
                is.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        }
        if (input == null) {
            return;
        }
        // Get a JPEG writer and set explicit compression.
        Iterator iter = ImageIO.getImageWritersByFormatName("jpg");
        if (iter.hasNext()) {
            ImageWriter writer = (ImageWriter) iter.next();
            ImageWriteParam iwp = writer.getDefaultWriteParam();
            iwp.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            float[] values = iwp.getCompressionQualityValues();
            iwp.setCompressionQuality(values[2]);
            String newName = realPath + "/" + "Compress" + getFileName(fileName);
            File outFile = new File(newName);
            FileImageOutputStream output = new FileImageOutputStream(outFile);
            writer.setOutput(output);
            IIOImage image = new IIOImage(input, null, null);
            System.out.println("Writing with quality " + values[2]);
            writer.write(null, image, iwp);
            input.flush();
            output.flush();
            output.close();
            writer.dispose();
        }
    } catch (FileNotFoundException finfExcp) {
        System.out.println(finfExcp);
    } catch (IOException ioExcp) {
        System.out.println(ioExcp);
    }
}

// If the image is already a BufferedImage it could simply be cast and
// returned; here we always create a new BufferedImage and draw the
// original image on it, scaled down to at most 330x250.
private BufferedImage getBufferedImage(Image img) {
    int w = img.getWidth(null);
    int h = img.getHeight(null);
    int thumbWidth = 330;
    int thumbHeight = 250;
    // If the width is less than 330, keep the width as it is.
    if (w < thumbWidth)
        thumbWidth = w;
    // If the height is less than 250, keep the height as it is.
    if (h < thumbHeight)
        thumbHeight = h;
    // If larger than 330x250, shrink while keeping the aspect ratio.
    if (w > 330 || h > 250) {
        double imageRatio = (double) w / (double) h;
        // Note: the original listing divided thumbWidth by itself here,
        // which always yielded 1; the intended ratio is width/height.
        double thumbRatio = (double) thumbWidth / (double) thumbHeight;
        if (thumbRatio < imageRatio) {
            thumbHeight = (int) (thumbWidth / imageRatio);
        } else {
            thumbWidth = (int) (thumbHeight * imageRatio);
        }
    }
    // Draw the original image to the thumbnail image object and
    // scale it to the new size on the fly.
    BufferedImage bi = new BufferedImage(thumbWidth, thumbHeight, BufferedImage.TYPE_INT_RGB);
    Graphics2D g2d = bi.createGraphics();
    g2d.drawImage(img, 0, 0, thumbWidth, thumbHeight, null);
    g2d.dispose();
    return bi;
}

public static BufferedImage fromRenderedToBuffered(RenderedImage img) {
    if (img instanceof BufferedImage) {
        return (BufferedImage) img;
    }
    ColorModel cm = img.getColorModel();
    int w = img.getWidth();
    int h = img.getHeight();
    WritableRaster raster = cm.createCompatibleWritableRaster(w, h);
    boolean isAlphaPremultiplied = cm.isAlphaPremultiplied();
    Hashtable props = new Hashtable();
    String[] keys = img.getPropertyNames();
    if (keys != null) {
        for (int i = 0; i < keys.length; i++) {
            props.put(keys[i], img.getProperty(keys[i]));
        }
    }
    BufferedImage ret = new BufferedImage(cm, raster, isAlphaPremultiplied, props);
    img.copyData(raster);
    return ret;
}

/**
 * Maps any supported extension to ".jpg" for the output file name.
 */
private String getFileName(String fileName) {
    String filName = fileName;
    if (!filName.endsWith(".jpg")) {
        if (filName.endsWith(".bmp")) {
            filName = filName.replaceAll(".bmp", ".jpg");
        }
        if (filName.endsWith(".jpeg")) {
            filName = filName.replaceAll(".jpeg", ".jpg");
        }
        if (filName.endsWith(".png")) {
            filName = filName.replaceAll(".png", ".jpg");
        }
        if (filName.endsWith(".gif")) {
            filName = filName.replaceAll(".gif", ".jpg");
        }
    }
    return filName;
}
Using GZIP for HTTP request response compression
What is GZIP?
It is a compression format created by Jean-Loup Gailly and Mark Adler. Version 0.1 was first publicly released on October 31, 1992.
GZIP is based on the DEFLATE algorithm, which is a combination of LZ77 and Huffman coding. DEFLATE was intended as a replacement for LZW and other patent-encumbered data compression algorithms which, at the time, limited the usability of compress and other popular archivers.
Effect of compression on HTTP transport
The time it takes to transfer an HTTP request and response across the network can be significantly reduced by decisions made by front-end engineers. It's true that the end-user's bandwidth speed, Internet service provider, proximity to peering exchange points, etc. are beyond the control of the development team. But there are other variables that affect response times. Compression reduces response times by reducing the size of the HTTP response.
Starting with HTTP/1.1, web clients indicate support for compression with the Accept-Encoding header in the HTTP request. Accept-Encoding: gzip, deflate
If the web server sees this header in the request, it may compress the response using one of the methods listed by the client. The web server notifies the web client of this via the Content-Encoding header in the response. Content-Encoding: gzip
Gzip is the most popular and effective compression method at this time. It was developed by the GNU project and standardized by RFC 1952. The only other compression format you're likely to see is deflate, but it's less effective and less popular.
Order of reduction in request response size
Gzipping generally reduces the response size by about 70%. Approximately 90% of today's Internet traffic travels through browsers that claim to support gzip. If you use Apache, the module configuring gzip depends on your version: Apache 1.3 uses mod_gzip while Apache 2.x uses mod_deflate.
This is just a start. If the request and response are SOAP-based or use any other XML protocol, the reduction can exceed 90%.
Issues with Compression
There are known issues with browsers and proxies that may cause a mismatch in what the browser expects and what it receives with regard to compressed content. Fortunately, these edge cases are dwindling as the use of older browsers drops off. The Apache modules help out by adding appropriate Vary response headers automatically.
Servers choose what to gzip based on file type, but are typically too limited in what they decide to compress. Most web sites gzip their HTML documents. It's also worthwhile to gzip your scripts and stylesheets, but many web sites miss this opportunity. In fact, it's worthwhile to compress any text response including XML and JSON. Image and PDF files should not be gzipped because they are already compressed. Trying to gzip them not only wastes CPU but can potentially increase file sizes.
Gzipping as many file types as possible is an easy way to reduce page weight and accelerate the user experience.
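To see the mechanics in code, here is a minimal sketch using java.util.zip from the JDK (the class name GzipDemo and the sample payload are ours). It shows why repetitive text such as HTML or XML compresses so well:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {

    // Compress a byte array with gzip, as a server would before sending
    // a response with "Content-Encoding: gzip".
    public static byte[] gzip(byte[] input) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(bos);
        gz.write(input);
        gz.close(); // finishes the deflate stream and writes the gzip trailer
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Repetitive markup (like HTML or XML) compresses very well.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 100; i++) {
            sb.append("<item>hello world</item>\n");
        }
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);
        byte[] packed = gzip(raw);
        System.out.println("raw=" + raw.length + " bytes, gzipped=" + packed.length + " bytes");
    }
}
```

On the server you would normally let Apache (mod_gzip/mod_deflate) or a servlet filter do this, but the underlying stream class is the same.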
Wednesday, June 29, 2011
Writing method interceptors using Spring AOP
Spring is a great Java technology that has become a very popular application framework during the past few years. My intention is not to go through the whole concepts and architectural details of the framework, because that kind of information can be easily looked up starting at http://www.springframework.org. As the article title indicates, I intend to provide hands-on examples showing the minimal requirements to bundle certain Spring functionalities in your Java applications. So, because I will not go into the “what’s under the hood” approach unless absolutely necessary, most of the examples might require the knowledge of basic Spring concepts. Anyway, the basic idea is that you must RTFM before deciding if Spring is right for your application.
The first example is a short look at a simple method intercepting strategy. You can read all about this and the whole Spring AOP API here. The source code for this example can be found here. In the project directory run
ant compile run
to launch the application. For the beginning, let's consider that we have the service MyService, which has a method doSomething() performing an operation that takes a long time to execute. Below you can see the (pretty dumb) code of this method.
public class MyService {
    public void doSomething() {
        for (int i = 1; i < 10000; i++) {
            System.out.println("i=" + i);
        }
    }
}
In order to print out the performance statistics on the method call, we must first implement the interceptor that actually calculates the execution time for this method. To do this we need to implement the org.aopalliance.intercept.MethodInterceptor interface shipped with Spring. This is actually a callback providing access to the actual call of the methods of our service. The JavaDoc for this interface is here.
import java.lang.reflect.Method;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;

public class ServiceMethodInterceptor implements MethodInterceptor {
    public Object invoke(MethodInvocation methodInvocation) throws Throwable {
        long startTime = System.currentTimeMillis();
        Object result = methodInvocation.proceed();
        long duration = System.currentTimeMillis() - startTime;
        Method method = methodInvocation.getMethod();
        String methodName = method.getDeclaringClass().getName() + "." + method.getName();
        System.out.println("Method '" + methodName + "' took " + duration + " milliseconds to run");
        // Return the result of the intercepted call (the original listing
        // returned null here, which would break non-void methods).
        return result;
    }
}
Next we need to proxy our service in order to obtain an instance whose methods are being intercepted by our ServiceMethodInterceptor. To achieve this, all it takes is a little magic in Spring's bean configuration file, as you can see below.
<beans>
    <bean id="myService" class="com.test.MyService"/>
    <bean id="interceptor" class="com.test.ServiceMethodInterceptor"/>
    <bean id="interceptedService" class="org.springframework.aop.framework.ProxyFactoryBean">
        <property name="target">
            <ref bean="myService"/>
        </property>
        <property name="interceptorNames">
            <list>
                <value>interceptor</value>
            </list>
        </property>
    </bean>
</beans>
The key in this XML snippet is Spring's built-in class org.springframework.aop.framework.ProxyFactoryBean, which provides the actual proxying of our service. In order to obtain the desired effect we must set the target and interceptorNames properties for this bean. The target property is the name of the bean that we want to proxy, in our case the myService bean. The interceptorNames property holds a list of bean names that will be used as interceptors for the proxied bean. So, yes, you can define more than one interceptor for your bean. As everything seems to be wrapped up pretty nicely, all we need to do now is to have our service instantiated using Spring and call its doSomething method.
public class Test {
    public static void main(String[] args) {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("com/test/applicationContext.xml");
        MyService myService = (MyService) ctx.getBean("interceptedService");
        myService.doSomething();
    }
}
So we need to look up the interceptedService bean in order to get the proxied service, but if we choose to remove the performance monitor we can simply look up the initial myService bean. Normally, after the method doSomething has run, you should see something like this as the last output line:
Method 'com.test.MyService.doSomething' took 281 milliseconds to run
Apart from MethodInterceptor, Spring also offers other method interception strategies. For example you can choose to handle a method execution right before or immediately after the actual call, or when an exception is thrown during the execution of your method. The reference documentation about these types of interceptors that Spring offers is available here. Please note that basic performance monitoring can also be achieved by using Spring's built-in PerformanceMonitorInterceptor. We used this logic just as a sample for method intercepting, but as your intuition might tell you, this is just one of the many things you can do with this feature of Spring. For example, if you need to implement a fine-grained security module, you might choose not to allow a method call to execute if the user does not have rights to the business method. So, basically, you will have to see for yourself how you can use this functionality in your application. I hope you find this article useful.
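For intuition about what ProxyFactoryBean does under the hood, the same timing-interceptor idea can be sketched with a plain JDK dynamic proxy, no Spring required (the interface and class names below are ours; Spring additionally supports class targets via CGLIB, which java.lang.reflect.Proxy cannot do):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class TimingProxyDemo {

    public interface Service {
        String doSomething();
    }

    public static class MyServiceImpl implements Service {
        public String doSomething() {
            return "done";
        }
    }

    // Wrap any interface-based target in a timing interceptor.
    public static <T> T withTiming(Class<T> iface, T target) {
        InvocationHandler handler = (proxy, method, args) -> {
            long start = System.nanoTime();
            Object result = method.invoke(target, args); // the actual call
            long micros = (System.nanoTime() - start) / 1000;
            System.out.println("Method '" + method.getName() + "' took " + micros + " microseconds to run");
            return result;
        };
        return iface.cast(Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[] { iface }, handler));
    }

    public static void main(String[] args) {
        Service service = withTiming(Service.class, new MyServiceImpl());
        System.out.println(service.doSomething());
    }
}
```

The InvocationHandler plays the role of the MethodInterceptor, and Proxy.newProxyInstance plays the role of ProxyFactoryBean.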
Dynamic in-memory compilation using javax.tools
Once I needed to calculate expressions dynamically, and one option I had was dynamic in-memory compilation. I searched and, to my surprise, noticed this feature under the javax.tools package, added almost at the end of Java 6's lifetime (I am expecting Java 7 to be out soon...). Maybe I am the last one to notice it! A similar feature is present in .NET as well. I eventually had to leave this idea, because of the memory overhead I suspected, and designed the whole expression calculator application differently. But let's still see how dynamic compilation is possible in Java.
This dynamic compiler API is included with Java 6 under javax.tools package.
How does it work?
javax.tools package has all the required interfaces and classes. Here, we will see how to compile a simple “HelloWorld” program source code stored in an in-memory String variable.
Being able to compile a piece of source code stored in a string variable: WOW! This is interesting, isn't it?
Follow the sequence of steps mentioned below. I explained these steps with the required code-snippets at that point. The full version of source code is available at the end of the article.
Important APIs
The most important classes in this API are,
- JavaCompiler: used to create a compilation task
- JavaCompiler.CompilationTask: the compilation task, on which we execute the compile operation using its call() method
- JavaFileManager: manages how the compiler reads and writes files
- JavaFileObject: a file object that abstracts Java source and class files
- DiagnosticListener: listens to compilation diagnostic events
- ToolProvider: used to get the compiler object from the underlying platform
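One practical caveat worth checking up front: ToolProvider.getSystemJavaCompiler() returns null when the program runs on a plain JRE without the compiler tooling, so it pays to verify a compiler is actually present before building any compilation task. A minimal check (the class name is my own, for illustration):

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompilerCheck {
    public static void main(String[] args) {
        // Null means we are on a JRE (or the compiler module is missing)
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler == null) {
            System.out.println("No system compiler available - run on a JDK, not a plain JRE");
        } else {
            System.out.println("Compiler found: " + compiler.getClass().getName());
        }
    }
}
```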
Looking at the Example
1. Build the source code to compile; we can read it from the file system, retrieve it from a database, or generate it dynamically in memory!
Get the source code to be dynamically compiled ready:
StringBuilder src = new StringBuilder();
src.append("public class DynaClass {\n");
src.append("    public String toString() {\n");
src.append("        return \"Hello, I am \" + ");
src.append("this.getClass().getSimpleName();\n");
src.append("    }\n");
src.append("}\n");
Create a JavaFileObject instance for each compilation unit.
If the source is not from the file system, we need to write a class implementing the JavaFileObject interface. Java 6 provides a sample implementation in the form of SimpleJavaFileObject; we can extend it and customize it to our needs. CharSequenceJavaFileObject extends SimpleJavaFileObject and represents the source code we want to compile. Normally instances of SimpleJavaFileObject would point to a real file in the file system, but in our case we want it to represent a StringBuilder created by us dynamically. Let's see how it goes:
import java.net.URI;

import javax.tools.JavaFileObject.Kind;
import javax.tools.SimpleJavaFileObject;

public class CharSequenceJavaFileObject extends SimpleJavaFileObject {

    /** CharSequence representing the source code to be compiled. */
    private CharSequence content;

    /**
     * Stores the source code in the internal "content" variable and
     * registers it as source code, using a URI containing the full
     * class name.
     *
     * @param className name of the public class in the source code
     * @param content   source code to compile
     */
    public CharSequenceJavaFileObject(String className, CharSequence content) {
        super(URI.create("string:///" + className.replace('.', '/')
                + Kind.SOURCE.extension), Kind.SOURCE);
        this.content = content;
    }

    /**
     * Answers the CharSequence to be compiled: the source code
     * stored in the "content" variable.
     */
    @Override
    public CharSequence getCharContent(boolean ignoreEncodingErrors) {
        return content;
    }
}
If the source code is from the file system, create JavaFileObject instances from the File objects read from the file system:

// Java source files read from the file system
File[] files = new File[] { file1, file2 };
Iterable<? extends JavaFileObject> compilationUnits =
        fileManager.getJavaFileObjectsFromFiles(Arrays.asList(files));
In this article, though, I keep the file object as an in-memory one, i.e. not from the file system.
Representing the compiled byte code
Next we must define the class representing the output of the compilation: the compiled byte code. It is needed by the ClassFileManager which we will describe later. The compiler takes the source code, compiles it and spits out a sequence of bytes which must be stored somewhere. Normally they would be stored in a .class file, but in our case we just want to make a byte array out of it. Here is a class that fulfills our needs:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.URI;

import javax.tools.JavaFileObject.Kind;
import javax.tools.SimpleJavaFileObject;

public class JavaClassObject extends SimpleJavaFileObject {

    /**
     * Byte code created by the compiler will be stored in this
     * ByteArrayOutputStream so that we can later get the byte array
     * out of it and put it in memory as an instance of our class.
     */
    protected final ByteArrayOutputStream bos = new ByteArrayOutputStream();

    /**
     * Registers the compiled class object under a URI containing the
     * full class name.
     *
     * @param name full name of the compiled class
     * @param kind kind of the data; it will be CLASS in our case
     */
    public JavaClassObject(String name, Kind kind) {
        super(URI.create("string:///" + name.replace('.', '/')
                + kind.extension), kind);
    }

    /**
     * Will be used by our file manager to get the byte code that can
     * be put into memory to instantiate our class.
     *
     * @return compiled byte code
     */
    public byte[] getBytes() {
        return bos.toByteArray();
    }

    /**
     * Provides the compiler with an output stream that leads to our
     * byte array. This way the compiler writes everything into the
     * byte array that we will instantiate later.
     */
    @Override
    public OutputStream openOutputStream() throws IOException {
        return bos;
    }
}
At some point during compilation, the compiler will call the openOutputStream() method of our JavaClassObject class and write the compiled byte code there. Because openOutputStream() returns a reference to the bos variable, everything is written into it, so that afterwards we are able to get the byte code back out.
FileManager - putting the bytecode into JavaClassObject
We will also need something like a "file manager" that tells the compiler to put the compiled byte code into an instance of our JavaClassObject class instead of writing it to a file. Here it is:

import java.io.IOException;
import java.security.SecureClassLoader;

import javax.tools.FileObject;
import javax.tools.ForwardingJavaFileManager;
import javax.tools.JavaFileObject;
import javax.tools.JavaFileObject.Kind;
import javax.tools.StandardJavaFileManager;

public class ClassFileManager
        extends ForwardingJavaFileManager<StandardJavaFileManager> {

    /**
     * Instance of JavaClassObject that will store the compiled
     * bytecode of our class.
     */
    private JavaClassObject jclassObject;

    /**
     * Initializes the manager with the specified standard java file
     * manager.
     *
     * @param standardManager the delegate file manager
     */
    public ClassFileManager(StandardJavaFileManager standardManager) {
        super(standardManager);
    }

    /**
     * Used by us to get the class loader for our compiled class. It
     * creates an anonymous class extending SecureClassLoader which
     * uses the byte code created by the compiler and stored in the
     * JavaClassObject, and returns the Class for it.
     */
    @Override
    public ClassLoader getClassLoader(Location location) {
        return new SecureClassLoader() {
            @Override
            protected Class<?> findClass(String name)
                    throws ClassNotFoundException {
                byte[] b = jclassObject.getBytes();
                return super.defineClass(name, b, 0, b.length);
            }
        };
    }

    /**
     * Gives the compiler an instance of the JavaClassObject so that
     * the compiler can write the byte code into it.
     */
    @Override
    public JavaFileObject getJavaFileForOutput(Location location,
            String className, Kind kind, FileObject sibling)
            throws IOException {
        jclassObject = new JavaClassObject(className, kind);
        return jclassObject;
    }
}
The getClassLoader() function will be called by us to get a ClassLoader instance for instantiating our compiled class. It returns an instance of SecureClassLoader whose findClass() method gets the compiled byte code stored in the instance of JavaClassObject, defines a class out of it with defineClass() and returns it.
Writing our dynamic compiler
import java.util.ArrayList;
import java.util.List;

import javax.tools.JavaCompiler;
import javax.tools.JavaFileManager;
import javax.tools.JavaFileObject;
import javax.tools.ToolProvider;

import com.vaani.compiler.files.CharSequenceJavaFileObject;
import com.vaani.compiler.files.ClassFileManager;

public class DynamicCompiler {

    private JavaFileManager fileManager;
    private String fullName;
    private String sourceCode;

    /**
     * @param fullName_ full name of the class that will be compiled.
     *        If the class should be in some package, fullName should
     *        contain it too (e.g. "testpackage.DynaClass")
     * @param srcCode_ the source code of the class to be compiled
     */
    public DynamicCompiler(String fullName_, String srcCode_) {
        fullName = fullName_;
        sourceCode = srcCode_;
        fileManager = initFileManager();
    }

    public JavaFileManager initFileManager() {
        if (fileManager != null) {
            return fileManager;
        }
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        fileManager = new ClassFileManager(
                compiler.getStandardFileManager(null, null, null));
        return fileManager;
    }

    public void compile() {
        // We get an instance of JavaCompiler; the file manager
        // (our custom implementation) was created in the constructor
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();

        // Dynamic compiling requires specifying a list of "files" to
        // compile. In our case this is a list containing one "file",
        // our own in-memory implementation (see details above)
        List<JavaFileObject> jfiles = new ArrayList<JavaFileObject>();
        jfiles.add(new CharSequenceJavaFileObject(fullName, sourceCode));

        // We specify a task for the compiler. The compiler should use
        // our file manager and our list of "files".
        // Then we run the compilation with call()
        compiler.getTask(null, fileManager, null, null, null, jfiles).call();
    }

    public void run() throws InstantiationException, IllegalAccessException,
            ClassNotFoundException {
        // Creating an instance of our compiled class and
        // running its toString() method
        Object instance = fileManager.getClassLoader(null)
                .loadClass(fullName).newInstance();
        System.out.println(instance);
    }
}
As you see, the code we want to compile is stored in the variable sourceCode. Our dynamic compiler object takes two parameters: the fully qualified class name and the source code. In the constructor both get initialized, and then we initialize the file manager as well, which will hold the compiled classes.
Let's first focus on the compile() method. We get an instance of the compiler and put the source code into an object representing a source file. The real compilation starts when we call the call() method of the compilation task. Afterwards we get the Class representing our compiled class from the file manager, instantiate it and print it to the console, using the toString() method we implemented in the source code.
There are three classes used in the code that are not available in the JDK, so we had to implement them ourselves: CharSequenceJavaFileObject, JavaClassObject and ClassFileManager. These classes have already been implemented and explained above.
In the run() method, we call getClassLoader() to load the class we need an instance of.
Running the program
The main method
import com.vaani.compiler.DynamicCompiler;
import com.vaani.compiler.src.SourceCodes;

public class DynaCompTest {

    public static void main(String[] args) throws Exception {
        // Full name of the class that will be compiled.
        // If the class should be in some package,
        // fullName should contain it too (e.g. "testpackage.DynaClass")
        String fullName = SourceCodes.strDynaClassFullName;

        // Here we get the source code of the class to be compiled
        String src = SourceCodes.getDynaClassSource();

        DynamicCompiler uCompiler = new DynamicCompiler(fullName, src);
        uCompiler.compile();
        uCompiler.run();
    }
}
Output
Now that we have all our classes ready, let's compile them and run the program. Since run() prints the instance of the compiled DynaClass, we should get an output like this:

Hello, I am DynaClass
Application of dynamic compilation
As mentioned in the beginning, I needed this for an expression calculator, for expressions such as "y = 2*(sin(x) + 4.0)". Using dynamic compilation you don't have to parse the expression yourself any more; you can just compile it and get a fast, optimized function representing it. You can read about it (and much more about dynamic compilation generally) here.
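As a rough, self-contained sketch of that idea (the Expr class and eval method are my own names, not from any library; for brevity it compiles to a temporary directory with JavaCompiler.run() and loads the result with a URLClassLoader, instead of the in-memory file manager built above):

```java
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class ExpressionDemo {
    public static void main(String[] args) throws Exception {
        String expr = "2 * (Math.sin(x) + 4.0)";

        // Wrap the expression in a tiny class with a static eval(double) method
        String src = "public class Expr {\n"
                   + "    public static double eval(double x) {\n"
                   + "        return " + expr + ";\n"
                   + "    }\n"
                   + "}\n";

        // Write the source to a temp directory and compile it in place
        Path dir = Files.createTempDirectory("dynacomp");
        Path file = dir.resolve("Expr.java");
        Files.write(file, src.getBytes("UTF-8"));

        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler.run(null, null, null, file.toString()) != 0) {
            throw new IllegalStateException("compilation failed");
        }

        // Load the freshly compiled class and invoke eval reflectively
        try (URLClassLoader loader =
                new URLClassLoader(new URL[] { dir.toUri().toURL() })) {
            Class<?> cls = Class.forName("Expr", true, loader);
            Method eval = cls.getMethod("eval", double.class);
            double y = (Double) eval.invoke(null, 0.0);
            System.out.println("y = " + y); // sin(0) = 0, so y = 8.0
        }
    }
}
```

Swapping the temporary directory for the ClassFileManager from this article keeps the whole round trip in memory, with no files touching the disk.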
Another usage is creating dynamic classes for accessing data stored in JavaBeans. Normally you would have to use reflection for this, but reflection is slow and it's generally better to avoid it when possible. Dynamic compilation allows you to minimize the use of reflection in a library that handles JavaBeans. How? We will try to show it in one of our next posts, so stay tuned!