DigitalJoel

2014/01/10

Exposing a read-only view of a Spring-MVC web service

Alright, so this is actually more flexible than just a read-only view, but that was the case that prompted me to play around with things, so that’s where I’m starting. I was partially inspired by a co-worker’s blog entry regarding creating resource filters with Jersey and JAX-RS 2.0.

So down to the scenario. I have a simple CRUD webservice that I’ve implemented in Spring-MVC. For my demonstration I used Spring Boot, but you can do it any way you want. One key is that this solution depends on a new feature found in Spring Framework version 4.0.

In my webservice I have a @Controller that has @RequestMappings for GET, PUT, POST, and DELETE, following the normal REST semantics for each method. Now, I have this webservice securely deployed in my production environment and all of my internal services can hit it and everything is awesome.

Now let’s pretend I want to expose some of the resources on the big, bad internet. I want to expose all the GET resources so my front end developers can read the information and put it in a web page, or so my mobile apps can get at it, but I don’t really want to expose the ability for them to create, update, or delete information. Now I’ve got a couple of options.

Option 1

I create a new webservice. It shares the dependencies of the original so it has access to all the same services, but the controller doesn’t contain any RequestMappings other than the GET resources I want to expose. This is very secure because I have total control over what is available. If the original service was designed appropriately, so the controllers don’t contain any business logic, then you can easily reuse all of the logic in the previous webservice. If not, then it’s a good opportunity to get that done, I guess. On the downside, you now have two artifacts to maintain and deploy.

Option 2

I create a webservice that will proxy requests from the big, bad internet and send them to my internal webservice.  The proxy returns a 404 for any resource/method that should not be exposed, and forwards other requests on to the internal webservice.  Again, my service is secure and I can manage which of the resources are exposed.  Also, again, I have two deployables, and this time they aren’t nearly as related as they were before.  The proxy can be very thin, possibly something as simple as nginx or apache with appropriate rules.
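To give a flavor of Option 2, the proxy rules could be as small as a single nginx location block — a hypothetical sketch only (the upstream name is made up, and note that nginx’s limit_except answers with a 403 rather than the 404 described above):

```nginx
location /api/ {
    # Allow only GET (nginx implicitly permits HEAD too); everything else is denied.
    limit_except GET {
        deny all;
    }
    proxy_pass http://internal-webservice;
}
```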

Option 3

This is the option I will explore.  With this option, I modify my webservice so that it can be deployed internally AND externally and lock down the resources that shouldn’t be exposed to the public without having to create a separate deployable artifact.  We will simply annotate those request handlers that should be exposed to the public, basically forming a white-list, and all those that are not explicitly exposed will be restricted from view when certain conditions are met.

In addition, this solution will automatically apply a Jackson JsonView to restrict which properties of the data are exposed, not just which request mappings are exposed.  This will allow us to give a restricted view of the response for the general public on the big bad internet, and the full data for those hitting our internal deployment of the webservice.  We would still be deploying to two environments, one for the public and one for internal, but it would be the same artifact in both places.

First, we are going to use the new @Conditional annotation that was introduced with Spring 4.0.  It allows you to conditionally create a Spring bean.  We will use conditionally defined beans to modify the behavior of the application at runtime.

To The Code

First, the Condition that allows us to change the behavior of the application without having to change any code. My condition is based on the IP address assigned to the server. You could modify the condition to whatever fits your needs. Maybe it checks an environment variable or something. It’s important to note that this condition is evaluated when the bean is created, so if it’s a singleton bean it’ll only be evaluated once. If you are looking to have the condition depend on something from the client then it would probably have to be a request scoped bean, but I haven’t checked to see if that actually works or not. It seems like it should.

/**
 * Condition to check if we are in production or not.
 */
public class ProductionCondition implements Condition {

  @Override
  public boolean matches(ConditionContext context, AnnotatedTypeMetadata meta) {
    try {
      Enumeration<NetworkInterface> ifaces = NetworkInterface.getNetworkInterfaces();
      while ( ifaces.hasMoreElements()) {
        NetworkInterface iface = ifaces.nextElement();
        Enumeration<InetAddress> addresses = iface.getInetAddresses();
        while ( addresses.hasMoreElements()) {
          InetAddress address = addresses.nextElement();
          // Set whatever your public, production IP Address space is here!
          if ( address.getHostAddress().startsWith("192.168" )) {
            // If we match, then return true so the bean annotated with this conditional will be created.
            return true;
          }
        }
      }
    }
    catch (SocketException e) {
      // If we can't enumerate the interfaces, fall through and treat it as non-production.
    }
    return false;
  }
}

Now we can use the above Condition to conditionally create Spring beans.

Here’s my Spring Boot application.  It also defines other beans for my spring-data-jpa repositories, but those aren’t relevant to what we are doing so I’ve left them out.

@Configuration
@ComponentScan
@EnableAutoConfiguration
@EnableJpaRepositories
public class Application {

  public static void main (String[] args ) {
    SpringApplication.run(Application.class, args );
  }

  @Configuration
  @Conditional(ProductionCondition.class)
  static class WebConfig extends WebMvcConfigurerAdapter {
    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
      MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
      ObjectMapper mapper = new ObjectMapper() {
        private static final long serialVersionUID = 1L;
        @Override
        protected DefaultSerializerProvider _serializerProvider(SerializationConfig config) {
          return super._serializerProvider(config.withView(Views.Public.class));
        }
      };
      mapper.configure(MapperFeature.DEFAULT_VIEW_INCLUSION, false);
      converter.setObjectMapper(mapper);
      converters.add(converter);
    }
  }

  /**
   * Only create this bean if we are in "production" mode.
   * @return
   */
  @Bean
  @Conditional(ProductionCondition.class)
  public MappedInterceptor publicHandlerInterceptor() {
    return new MappedInterceptor(null, new PublicHandlerInterceptor());
  }

  // Other beans here for JPA configuration
}

Notice that in the application I have two @Conditional beans. One is a new HandlerInterceptor that I’ll show in a second. The other is a full @Configuration. Because the publicHandlerInterceptor @Bean definition returns a MappedInterceptor it will automatically be configured within the Spring MVC application. If it returned a HandlerInterceptor then more work would have to be done to register it with the Spring MVC application.
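For completeness, here’s roughly what that extra work would look like if the bean were a plain HandlerInterceptor instead — a sketch only, reusing the same condition:

```java
// Hypothetical alternative registration: a plain HandlerInterceptor must be
// added to the registry by a WebMvcConfigurerAdapter; it is not picked up
// automatically the way a MappedInterceptor bean is.
@Configuration
@Conditional(ProductionCondition.class)
static class InterceptorConfig extends WebMvcConfigurerAdapter {
  @Override
  public void addInterceptors(InterceptorRegistry registry) {
    registry.addInterceptor(new PublicHandlerInterceptor());
  }
}
```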

Secondly, notice that the conditional Configuration class extends WebMvcConfigurerAdapter, allowing me to easily configure Spring MVC-type functionality. Sadly, configuring a custom Jackson ObjectMapper in Spring is much more painful (IMO) than it ought to be, so I’m going to go off on a bit of a tangent. Skip to the next section if you are confident in your ObjectMapper abilities.

ObjectMapper Tangent

It would be fantastic if I could configure the ObjectMapper used for a @ResponseBody by simply defining a @Bean named objectMapper and be good to go. Sadly, that’s not the case. I had to add the MessageConverter in the configuration and set the ObjectMapper for that MessageConverter. Now, here’s the rub: Jackson’s SerializationConfig is immutable. Calling getSerializationConfig() and then any of the handy .with(...) methods just doesn’t work, because that simply returns a new instance of SerializationConfig and doesn’t modify the one held by the ObjectMapper. You can see my learning process for this at StackOverflow.
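To make the gotcha concrete, here’s the difference between the call that silently does nothing and the one that sticks (a sketch; I believe setConfig requires Jackson 2.1 or later):

```java
ObjectMapper mapper = new ObjectMapper();

// Does NOTHING to the mapper: withView() returns a new copy of the immutable
// SerializationConfig, and the mapper keeps its original configuration.
mapper.getSerializationConfig().withView(Views.Public.class);

// Works: hand the modified copy back to the mapper.
mapper.setConfig(mapper.getSerializationConfig().withView(Views.Public.class));
```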

Back to the Show

So, the reason I needed to modify the ObjectMapper configuration was so that I could make it always use a given Jackson JsonView for every @ResponseBody encountered. The custom implementation of the ObjectMapper I pasted was the first way I found to configure it to always use the JsonView I specified; otherwise I had to call writerWithView on the mapper, and I wasn’t sure where to do that. This configuration gives us the white-list of data properties that should be serialized in each response.

To use it, simply annotate the object returned as your @ResponseBody with the @JsonView annotation from Jackson, something like:

  @JsonView(value={Views.Public.class})
  public String getName() {
    return name;
  }
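For reference, the Views class itself isn’t shown above; it’s nothing more than a holder for marker interfaces. A minimal sketch (the Internal name is my assumption, not from the original) might let the internal view extend the public one so internal clients see everything public clients do:

```java
// Sketch of a Views holder; the view "classes" are empty marker interfaces.
public class Views {
    // Properties marked with @JsonView(Views.Public.class) are safe for the public.
    public interface Public {}

    // Hypothetical internal view: extends Public, so serializing with it
    // includes the public properties plus any internal-only ones.
    public interface Internal extends Public {}
}
```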

Securing the RequestMappings

The Application configuration has a conditional bean for a HandlerInterceptor, which looks like this:

public class PublicHandlerInterceptor extends HandlerInterceptorAdapter {
  @Override
  public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
    // Guard the cast: not every handler is a HandlerMethod (static resource handlers, for example).
    if ( handler instanceof HandlerMethod ) {
      HandlerMethod method = (HandlerMethod)handler;
      if ( method.getMethodAnnotation(Public.class) != null ) {
        return true;
      }
    }
    response.setStatus(404);
    return false;
  }
}

This HandlerInterceptor will be evaluated for every RequestMapping. Here, we look at the actual method that is being called to handle the request. If it is annotated with our custom @Public annotation, then we allow the request to proceed by returning true from the HandlerInterceptor. If it isn’t, then we return false and send a 404 to the client.

Finally, here’s the Public annotation definition

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface Public {}

And its usage:

  @Public
  @RequestMapping(method=RequestMethod.GET, produces=MediaType.APPLICATION_JSON_VALUE)
  public @ResponseBody Iterable<MyObject> getCollection(
    @RequestParam(value="ids", required=false) List<Long> ids,
    @RequestParam(value="limit", required=false, defaultValue="100") int limit ) {
      // lookup a collection of MyObjects and return them
  }

  @RequestMapping( value="/{id}", method=RequestMethod.PUT, consumes=MediaType.APPLICATION_JSON_VALUE, produces=MediaType.APPLICATION_JSON_VALUE)
  public @ResponseBody MyObject putValue(@PathVariable Long id, @RequestBody MyObject d ) {
    // do some things to update an object and return the representation of the updated object
  }

With this in place, I’m able to deploy my webservice (with spring-boot it’s just a jar that contains embedded tomcat!) and run it without any further alterations. The getCollection method would be available in both deployment locations. The putValue handler would only be available in those deployment locations that do NOT match the condition I have specified, so only those that are visible internally. The representation of MyObject is appropriate for the deployment location without any further changes to the webservice either. I merely select the properties of MyObject that I want exposed publicly and annotate them with the appropriate JsonView.

A white-list approach ensures that nothing slips through the cracks to the big, bad internet just because a developer forgot to restrict it. Instead, they must evaluate each request handler and data property and explicitly expose it in the public view.

I could have had my proof of concept developed and tested in under 2 hours had I not run into my difficulties configuring the ObjectMapper. That’s a lesson I won’t soon forget though. I tested all this by making the condition match my IP address when I was connected to my work VPN. When I started the application up and I was connected it would restrict the request handlers and the serialized properties. If I was not connected I could execute any method and would see all of the data properties.

It’s probably not a perfect solution. Does such a thing exist? The one question I’ve thought of is what happens if my code is already using JsonViews? I’m not sure how the two would play together. Nevertheless, it is an interesting exploration of the capabilities of the @Conditional annotation and HandlerInterceptors.

2011/02/05

Using Mockito To Test Spring MVC Ajax Interaction

Filed under: java, spring, testing — digitaljoel @ 4:36 pm

So, I shared in Ajax Post to Spring MVC Controller what I learned about making an ajax post to a Spring MVC Controller. Then I shared in Mock Testing Spring MVC Controller what I learned about using Mockito to test my Spring MVC controller. So what about testing my RequestHandler that handles the ajax post and returns a JSON object? Well, as Samuel L. Jackson says in Jurassic Park, “Hold on to your butts.”

Here’s the method that handles the ajax post of form data.

    @RequestMapping( value="answer/new", method=RequestMethod.POST)
    public ResponseEntity<String> newAnswer( @RequestParam(value="answerSeverity", required=true ) String severity
            , @RequestParam( value="answerText", required=true ) String text
            , @RequestParam( value="requiresReason", required=false, defaultValue="false" ) boolean requiresReason
            , @RequestParam( value="answerReasons", required=false ) List<Long> reasonKeys
            )
    {
        Severity sev = Severity.valueOf( severity );
        SurveyAnswer answer = new SurveyAnswer( text, sev );
        answer.setRequiresReason( requiresReason );
        if ( requiresReason )
        {
            // add all the reasons
            List<SurveyAnswerReason> reasons = surveyService.findReasonsByKey( reasonKeys );
            for( SurveyAnswerReason reason : reasons )
            {
                answer.addReason( reason );
            }
        }
        answer = surveyService.persist( answer );
        this.getAnswers( sev ).add( answer );
        return createJsonResponse( answer );
    }

    private ResponseEntity<String> createJsonResponse( Object o )
    {
        HttpHeaders headers = new HttpHeaders();
        headers.set(  "Content-Type", "application/json" );
        String json = gson.toJson( o );
        return new ResponseEntity<String>( json, headers, HttpStatus.CREATED );
    }

You can read the previous post for information on what’s going on here, but basically, we handle the form post, create a new SurveyAnswer, and then return the created answer as a JSON object using the createJsonResponse method.

In order to mock test this, I’m going to have to mock all the calls to the surveyService methods. Those would be findReasonsByKey and persist. The persist mock was a bit tricky because I wanted it to just return the answer that was passed as an argument, to ensure that the controller was creating the answer correctly. Here’s the code to do that.


        when( surveyService.persist( any( SurveyAnswer.class ))).thenAnswer(
                new Answer<SurveyAnswer>()
                {
                    @Override
                    public SurveyAnswer answer( InvocationOnMock invocation ) throws Throwable
                    {
                        Object[] args = invocation.getArguments();
                        return (SurveyAnswer) args[0];
                    }
                });
        when ( surveyService.findReasonsByKey( anyCollectionOf( Long.class ))).thenReturn( getReasons() );

I put it in my @Before annotated setup method in my unit test. I didn’t come up with it myself, I adapted it from an excellent answer to this question on StackOverflow.com. That snippet allows me to just return the argument passed to the method, which is basically what JPA would do, other than setting the key and version, which I don’t really need for my test anyway. The mocked out findReasonsByKey just returns a list of objects that I’m creating elsewhere for testing purposes only.
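As an aside, Mockito versions from 1.9.5 on ship a built-in answer for this echo-the-first-argument pattern, so the anonymous Answer above can collapse to a one-liner — a sketch against the same mocked surveyService:

```java
import static org.mockito.AdditionalAnswers.returnsFirstArg;
import static org.mockito.Matchers.any;
import static org.mockito.Mockito.when;

// Same behavior as the anonymous Answer: persist() returns whatever it was given.
when( surveyService.persist( any( SurveyAnswer.class ))).thenAnswer( returnsFirstArg() );
```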

So, on to the test. Here’s the code:


    @Test
    public void testNewAnswerWithReasons()
    {
        ResponseEntity<String> response = controller.newAnswer( answerSeverity.name(), answerText,
                true, getReasonKeys() );
        assertEquals( "application/json", response.getHeaders().get( "Content-Type" ).get( 0 ));
        SurveyAnswer answer = gson.fromJson( response.getBody(), SurveyAnswer.class );
        assertEquals( getSingleAnswerWithReasons(), answer );
    }

There are some helper methods that create the object graph needed for the SurveyAnswer. It then calls the method on the controller (which is also initialized in the @Before setup method) and checks the result. I’m really looking for two things. First, that the response has the Content-Type set correctly to application/json, and second, that I get an answer that corresponds to the values I passed in. Here again, I use Google’s GSON library for converting from my JSON string to my Java object. Once that is done, I can just test for equality with the answer I’m expecting. Obviously, for that to work, you’ll need to make sure your equals method is correct, but that’s an issue well addressed elsewhere on the internet and well beyond the scope of this post.

2010/12/28

Spring MVC and JSR-303 Validation Groups

Filed under: development, java, spring — digitaljoel @ 8:27 pm

@Valid. That wonderful little annotation. In my Spring controller I do something like this.

    @RequestMapping( value="/editAccount", method=RequestMethod.POST )
    public String postEditAccount( Model model, HttpServletRequest request, 
            @Valid AccountInfo info, BindingResult infoResult )

and everything is great. I know that all of the JSR-303 annotations I’ve put on my AccountInfo bean have been validated and the data is all correct and good. Let’s say that my AccountInfo bean looks something like this.

public class AccountInfo
{

    @NotNull
    private String username;
    @NotNull
    private String newUsername;
    @NotNull
    private String confirmNewUsername;
    
    @NotNull
    private String password;
    @NotNull
    private String newPassword;
    @NotNull
    private String confirmNewPassword;
    
    @NotNull
    private String firstName;
    @NotNull
    private String lastName;
    @NotNull
    private String phone;
    private String fax;
    
    // constructors, getters, setters, etc. down here
}

Suddenly, everything isn’t so hunky dory anymore. Fortunately for us, JSR-303 has a great mechanism for only validating some portion of the object. It’s known as validation groups. There’s plenty of information out there on them, so I’ll give you the extreme Reader’s Digest version. Basically, you specify a list of marker interfaces in your validation annotations, and then when you call the validator, you can also pass in a list of the marker interfaces that you would like to validate against. Ok, that sentence doesn’t make much sense unless you already know about groups. So, here’s a new version of AccountInfo that demonstrates.


public class AccountInfo
{

    @NotNull( groups={ChangeUsername.class} )
    private String username;
    @NotNull( groups={ChangeUsername.class} )
    private String newUsername;
    @NotNull( groups={ChangeUsername.class} )
    private String confirmNewUsername;
    
    @NotNull( groups={ChangePassword.class} )
    private String password;
    @NotNull( groups={ChangePassword.class} )
    private String newPassword;
    @NotNull( groups={ChangePassword.class} )
    private String confirmNewPassword;
    
    @NotNull
    private String firstName;
    @NotNull
    private String lastName;
    @NotNull
    private String phone;
    private String fax;
    
    // constructors, getters, setters, etc. down here
    
    public interface ChangePassword {};
    public interface ChangeUsername {};

}

Now we’ve told the validator that when we run the validator without any groups, we want it to validate firstName, lastName, and phone. If you don’t specify any groups, they get the Default.class group. BUT, if we run the validator passing in the AccountInfo.ChangePassword.class, then it will only validate password, newPassword, and confirmNewPassword. If we want to do both, then we can pass in AccountInfo.ChangePassword.class AND Default.class and it will validate both groups. That’s awesome sauce right there. Now, we can use this same backing bean in the page where the user is created which contains all the fields, we can use it in the edit account info page which only has the stuff validated by Default, we can use it in our change password page, and we can also use it in our change username page, and in each case, we only validate the portions that we are concerned about for those pages. One bean for all four pages.
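To make the group selection concrete, here’s how the raw validator calls line up with those cases — a sketch, assuming an AccountInfo instance named info and a JSR-303 provider such as Hibernate Validator on the classpath:

```java
Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

// No groups: only the Default group, i.e. firstName, lastName, and phone.
validator.validate( info );

// Only the ChangePassword fields: password, newPassword, confirmNewPassword.
validator.validate( info, AccountInfo.ChangePassword.class );

// Both: the Default fields AND the ChangePassword fields.
validator.validate( info, Default.class, AccountInfo.ChangePassword.class );
```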

With that worked out, we should be able to just add the groups to our @Valid annotation, right? Nope. Wait, what? All that work to put in validation groups and we can’t even use them with the JSR-303 sanctioned validation annotation? Yep, that’s right. There’s an improvement in Spring’s Jira to add a new @Valid annotation that will allow you to specify groups, but until that happens, you’ll have to run the validator yourself.

I think it sounds worse than it is. As you can see in the controller method I put at the start of this post, after each @Valid annotated object you need to have the BindingResult in order to see the errors. Then, in your controller method, you have to check the BindingResult to see if there are errors: if there are, do something; if not, do something else. So, how is that any different from having to just run the check yourself? Here’s what I did.


    /**
     * Test validity of an object against some number of validation groups, or
     * Default if no groups are specified.
     * 
     * @param result Errors object for holding validation errors for use in
     *            Spring form taglib. Any violations encountered will be added
     *            to this errors object.
     * @param o Object to be validated
     * @param classes Validation groups to be used in validation
     * @return true if the object is valid, false otherwise.
     */
    private boolean isValid( Errors result, Object o, Class<?>... classes )
    {
        if ( classes == null || classes.length == 0 || classes[0] == null )
        {
            classes = new Class<?>[] { Default.class };
        }
        // Note: building the validator factory is relatively expensive; consider
        // creating it once (or having Spring inject a Validator) rather than per call.
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<Object>> violations = validator.validate( o, classes );
        for ( ConstraintViolation<Object> v : violations )
        {
            Path path = v.getPropertyPath();
            String propertyName = "";
            if ( path != null )
            {
                for ( Node n : path )
                {
                    propertyName += n.getName() + ".";
                }
                propertyName = propertyName.substring( 0, propertyName.length()-1 );
            }
            String constraintName = v.getConstraintDescriptor().getAnnotation().annotationType().getSimpleName();
            if ( propertyName == null || "".equals(  propertyName  ))
            {
                result.reject( constraintName, v.getMessage());
            }
            else
            {
                result.rejectValue( propertyName, constraintName, v.getMessage() );
            }
        }
        return violations.isEmpty();
    }

Alright, it’s a pretty simple method, but we’ll walk through it.


        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<Object>> violations = validator.validate( o, classes );

Here we get the validator instance and get the set of violations back. This is based on the validation groups that were passed in, or Default if there were none passed. I believe I read somewhere that you can actually get the Validator injected by Spring, but I haven’t played with it yet to find out. If you do and it works, let me know!
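For the record, here’s a sketch of that injection route (untested by me, so consider it hypothetical): Spring’s LocalValidatorFactoryBean implements javax.validation.Validator, so once it’s defined in the context you can inject the validator instead of rebuilding the factory on every call.

```xml
<!-- Hypothetical bean definition: lets Spring bootstrap the JSR-303 validator. -->
<bean id="validator"
      class="org.springframework.validation.beanvalidation.LocalValidatorFactoryBean" />
```

Then an @Autowired javax.validation.Validator field in the controller (or in a shared helper) would replace the Validation.buildDefaultValidatorFactory() call.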

Next is the part where we take the JSR-303 validations and map them to Spring form errors.


            Path path = v.getPropertyPath();
            String propertyName = "";
            if ( path != null )
            {
                for ( Node n : path )
                {
                    propertyName += n.getName() + ".";
                }
                propertyName = propertyName.substring( 0, propertyName.length()-1 );
            }

We get the property name of the violation, which will hopefully map to the “path” in the spring input tag you are using. I haven’t tested this on anything with any depth (for instance, if your bean contains an object that also has validation annotations on it) so I’m not sure how it’ll work there. Once again, if you find out, leave a comment. Anyway, now that we have the property name, we can use that later on to tell Spring which field failed validation so the correct errors field can be shown.


            String constraintName = v.getConstraintDescriptor().getAnnotation().annotationType().getSimpleName();

Now we get the name of the constraint that failed. In all cases above, it would be NotNull. If your annotation is something like @Size( min=85 ), then the constraint would be Size. We use this so we can get error messages mapped the same way Spring binding violations are, so if you are using custom messages in your messageSource for your fields or constraint messages, then this should work just the same.
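Because rejectValue feeds that constraint name through Spring’s DefaultMessageCodesResolver, your message bundle can override the message at several levels of specificity. These keys are hypothetical examples for the firstName field, assuming the form object is bound under the name accountInfo:

```properties
# Resolved most-specific-first by DefaultMessageCodesResolver:
NotNull.accountInfo.firstName=First name is required.
NotNull.firstName=This field is required.
NotNull=This value may not be empty.
```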


            if ( propertyName == null || "".equals(  propertyName  ))
            {
                result.reject( constraintName, v.getMessage());
            }
            else
            {
                result.rejectValue( propertyName, constraintName, v.getMessage() );
            }

Finally, before returning the true or false, we have to add the violations to the Errors object. If we have a propertyName, then that means it’s a field error. If we don’t, it’s a global object error, which ought to happen if you use a class level validation annotation instead of a field level validation annotation. Yep, I’m going to say it one more time. I haven’t tested that yet, but it won’t be long before I do since I’ve got to get a class level validator to ensure the newPassword and confirmNewPassword fields contain the same value.

So, that’s a lot of explaining for a single method, but hopefully it shows that adding this validation really isn’t that much more difficult than checking the BindingResult that Spring gives back to you after the default @Valid processing. In places where I don’t have to use validation groups, I’m obviously still going to use the @Valid annotation as is, and if Spring gives me a new annotation I can use to run validation groups, I’ll jump to that and rip this out right away. BUT, until then, this will have to do.

Finally, here’s how I call that method in my controller.


    @RequestMapping( value="/editAccount", method=RequestMethod.POST )
    public String postEditAccount( Model model, HttpServletRequest request, 
            AccountInfo info, BindingResult infoResult )
    {
        if ( !isValid( infoResult, info, AccountInfo.ChangePassword.class, AccountInfo.ChangeUsername.class ))
        {
            return "editAccount";
        }
        // otherwise we process the form and do stuff.
    }

Obviously, you would want to move the isValid method out of this controller and into something that can be shared between Controllers.

2010/12/09

JSP Date Formatting

Filed under: development, java — digitaljoel @ 10:58 am

I had an input field that took a date type as mentioned in my previous post. Now the problem was, while I input it in the form MM/dd/yyyy, when I was displaying it on the page for the user to modify, it would come back with time, timezone, all sorts of crap definitely NOT in MM/dd/yyyy format. This caused the form submission to fail unless the user corrected the field every time because date conversion would fail with the huge, lame format.

One wrinkle was that I wanted to support internationalization. While our application currently doesn’t have anything other than en_US right now, I want to make sure I am getting the right date pattern for all instances. So, this is what I did.

<label id="birthday_label" for="birthday" title="<spring:message code="user.birthday.alt" />">
    <spring:message code="user.birthday" />
</label>

<spring:message code="dateFormat" var="dateFormat" />

<input id="birthday" name="birthday" type="text" 
        value="<fmt:formatDate value="${reg.birthday}" 
        type="date" pattern="${dateFormat}" />" />

I have my spring message bundle available, and in that bundle, I have a field called dateFormat which contains the string MM/dd/yyyy. It’s also the field used for doing the date conversion on input. I needed to get the format for use in the fmt:formatDate tag, which is given as the value of the input tag. I’m not a fan of tags within attribute values of tags, but what can you do.

Well, I couldn’t have a tag, in a tag, in an attribute of a tag. Apparently you can only take that so far. So, the trick was using the var of the spring:message to store the dateFormat in something that I could later reference as ${dateFormat} in the pattern of the fmt:formatDate.

2010/11/15

Spring MVC binding to Strings and Dates

Filed under: development, spring — digitaljoel @ 2:29 pm

I spent the entire morning trying to figure out how to get Spring MVC to allow for null in my Date field. I would get an exception if the Date value in the form was left null. Once I found my way around that, I wasn’t getting any validation messages for all the fields marked as @NotNull. It turns out Spring just set the value to an empty string instead of null if the field was empty. Hibernate’s JSR-303 implementation has a @NotEmpty validation, but I decided to try to keep to the spec.

So, I implemented a custom @InitBinder for my @Controller and had an anonymous implementation of a custom editor, all based on an answer on stackoverflow.com. Finally, I found this bug logged against Roo:

https://jira.springsource.org/browse/ROO-190

Using that single line in my @InitBinder method I was then able to set Dates to null.  For the second problem, I used this very helpful blog post by Stefan Reuter

http://blogs.reucon.com/srt/2007/11/25/spring_mvc_null_or_an_empty_string.html

So, now my @InitBinder method looks like this.

    @InitBinder
    public void allowEmptyDateBinding( WebDataBinder binder )
    {
        // Allow for null values in date fields.
        binder.registerCustomEditor( Date.class, new CustomDateEditor( new SimpleDateFormat( getDatePattern()), true ));
        // tell spring to set empty values as null instead of empty string.
        binder.registerCustomEditor( String.class, new StringTrimmerEditor( true ));
    }

And as simple as that I get null instead of empty string for my string values, and I can allow null values in my non-required date fields. Too bad it took me 6 hours this morning to find the answers.

2010/11/01

Accessing Spring Session Beans in JSP

Filed under: development — digitaljoel @ 5:23 pm

I’m using Spring MVC in a project.  I wanted to create a session scoped bean that I could reference directly from my JSP.  In this case, it was a bean for holding status or error messages for display in the UI.  It would keep a queue of messages, and would clear them when displayed.

The interface for my MessageUtil class was simple, with an addMessage method for adding a message to the queue, and a getFlashMessages method that gets all messages as a list and clears the internal queue.

The implementation could be equally simple.  Mine has a touch more code in order to pull the actual message text from a resource bundle, but the class definition is very simple

public class MessageUtilImpl implements MessageUtil, Serializable
{
// implementation here
}
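Filling in that class definition, a minimal sketch of the queue logic might look like the following.  This is illustrative only: my real implementation also resolves the message text from a resource bundle via the injected messageSource, which is omitted here.

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

interface MessageUtil {
    void addMessage(String message);
    List<String> getFlashMessages();
}

// Minimal sketch of the session-scoped message holder described above.
public class MessageUtilImpl implements MessageUtil, Serializable {
    private final ConcurrentLinkedQueue<String> messages = new ConcurrentLinkedQueue<>();

    @Override
    public void addMessage(String message) {
        messages.add(message);
    }

    // "Flash" semantics: return everything queued and clear the queue,
    // so each message is displayed to the user exactly once.
    @Override
    public List<String> getFlashMessages() {
        List<String> result = new ArrayList<>();
        String m;
        while ((m = messages.poll()) != null) {
            result.add(m);
        }
        return result;
    }
}
```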

In my spring context configuration file, I defined the bean as follows:

    <bean id="messageUtil" class="mypackage.MessageUtilImpl" scope="session">
        <aop:scoped-proxy proxy-target-class="false"/>
        <property name="messageSource" ref="messageSource" />
    </bean>

Where the messageSource contains the bundle for messages. The real ticket here is the aop:scoped-proxy configuration.

Since I wanted to inject this message utility class into my Spring MVC controllers (which are singletons, far outliving any one session), Spring was puking at me. Adding the aop:scoped-proxy configuration to the bean definition (which apparently isn’t available yet as an annotation, which is why I had to configure it in XML) lets Spring inject an AOP proxy into the controller; on each request the proxy delegates to the messageUtil instance that was constructed for the current session.

One item of note is the proxy-target-class attribute. If you set it to false, Spring AOP will use a Java interface based proxy. This means that your bean must have an interface and an implementation, and that everywhere you use the bean, you must reference it via the interface and NOT the implementation. Well… DUH. If you have an interface and an impl, and you are referencing the impl, then what can I say? If you set the value to true, Spring AOP will use cglib (which must then be on your build path, and probably your runtime path) to proxy the implementation class, meaning you don’t need a separate interface; you can simply have a class. I didn’t want to depend on cglib, so I chose the interface based proxy.
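To see why the interface requirement exists, here’s a toy illustration using the JDK’s own dynamic proxies, the same mechanism Spring AOP uses for interface based proxies.  The names are made up, and a real scoped proxy also looks up the current session’s target bean on every call rather than holding one fixed target:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// JDK dynamic proxies can only implement interfaces, so any injection
// point must be typed to the interface, never the concrete class.
public class ScopedProxyDemo {
    interface Greeter {
        String greet();
    }

    static class GreeterImpl implements Greeter {
        public String greet() { return "hello"; }
    }

    static Greeter proxy(Greeter target) {
        InvocationHandler handler = (proxyObj, method, args) ->
            // A real scoped proxy would resolve the current session's
            // target bean here before delegating.
            method.invoke(target, args);
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                handler);
    }

    public static void main(String[] args) {
        Greeter g = proxy(new GreeterImpl());
        System.out.println(g.greet());                // prints "hello"
        System.out.println(g instanceof GreeterImpl); // prints "false"
    }
}
```

The last line is the whole point: the proxy is a Greeter but not a GreeterImpl, which is why injecting by the concrete class blows up when interface proxying is in effect.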

With that magic done, now all I had to do was reference my messageUtil bean in the jsps so I could call the getFlashMessages() method and display them.

Now, I’m no JSP guru. I’ve spent the last 3 years in JSF land. I’m sure I could wire this up to get the messages via ajax and do something super awesome… but I didn’t. I’m using the Pines Notify jQuery plugin to show a simple growl-type message.

<script type="text/javascript" >
    <c:forEach var="flashMessage" items="${sessionScope['scopedTarget.messageUtil'].flashMessages}" >
        $.pnotify( {
            pnotify_title: "${flashMessage.title}",
            pnotify_text: "${flashMessage.message}",
            pnotify_type: "${flashMessage.severity}",
            pnotify_shadow: true
        })
    </c:forEach>
</script>

Besides the aop:scoped-proxy, the thing that took the longest to figure out was how to get the stinking spring session bean. You can see that I’m accessing it like this

${sessionScope['scopedTarget.messageUtil'].flashMessages}

The answer is the ‘scopedTarget’ prefix to the bean name. Since it uses a dot in the name, you can’t use sessionScope.scopedTarget.messageUtil, so the way it’s referenced above is the only way I know how to do it. It took surprisingly long for me to find it.

I’m sure as soon as I publish this, someone will find a reference to it in the official spring documentation, but Adobe Reader didn’t find it in the 799 page Spring Documentation PDF I have.

That’s it for this one. Have fun with spring session scoped beans and jsp el.

2010/02/13

Spring Roo and JSF 2?

Filed under: development — digitaljoel @ 2:07 pm

In my latest project we’ve been playing with Spring Roo.  You can read all about the advantages of it on that site, but the gist is that it creates your entities and JPA persistence code for them.  It will also configure your database connection, generate unit tests, generate a JSP/JSTL based UI with all of the Spring MVC controllers, and selenium tests for it.  It will also optionally integrate Spring Web Flow into the mix.  It’s basically a web based CRUD application in a box.

I spent the last 3 days trying to figure out how to use JSF 2 in Glassfish 3 with a Roo generated application.  Much of this pain may have been caused by my lack of experience with JPA and Spring, but here are a few of the difficulties I faced.

  • JSF 2 is not well supported in eclipse yet, whereas Netbeans 6.8 has excellent JSF 2 support.
  • All of the entities created by Roo just have properties.  All of the getters, setters, and other methods are all included in aspect files and compiled into the entity through AspectJ.  The AspectJ plugin for eclipse takes care of this if you have weaving enabled.  Guess what… There’s no aspect based plugin available for Netbeans, so you get a ton of compile problems in your Roo generated project if you try to open it within Netbeans.
  • The Roo projects are all maven based.  I have yet to see really good success with any maven integration within Eclipse.  Netbeans maven integration is spot on.  For the past 2.5 years I’ve been developing within Eclipse and simply doing all of the maven stuff from the command line because it’s much simpler than dealing with Maven in Eclipse.
  • JSF can’t be used to call Spring MVC controllers.  The Spring MVC controllers generated by Roo are nice.  They are simple and clean, and having them managed by Roo is just that much better.  Unfortunately, Spring MVC is stateless, whereas JSF is stateful, so calling any Spring MVC controller from JSF without using Spring’s Web Flow just doesn’t work.
  • JSF 2 can’t be used in the current version of Spring Web Flow.  Even though a sub component of Web Flow is Spring Faces, the current version of Spring Faces only supports JSF 1.x.  As soon as I put Spring Faces in the classpath, my JSF 2 app wouldn’t render any .xhtml files for me.  Removing Spring Faces from the classpath allowed me to render .xhtml with JSF 2.
  • The transaction type specified in the Roo generated persistence unit is RESOURCE_LOCAL.  Glassfish 3 told me I had to use JTA.  I believe Roo can generate a JNDI based data source, which I haven’t tried yet.  I ended up manually modifying the persistence unit (after a lot of time on Google) to be JNDI based and use the JTA transaction type.  Unfortunately, this means that all of the Roo generated unit tests fail.  I suspected that if I had Roo generate the JNDI data source the unit tests would pass, so hopefully this issue was self-inflicted.  Update: I created a simple project with a JNDI data source.  Unfortunately, all of the unit tests still fail.  There is some information in the (closed as won’t fix) bug at https://jira.springsource.org/browse/ROO-311.  It would be nice if the unit tests that Roo creates would use the Spring SimpleNamingContext (and I may be just making stuff up now because I don’t know a lot about Spring/JNDI stuff) or something similar to allow the unit tests to run when the persistence setup uses JNDI.
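For reference, the edit to the Roo-generated persistence.xml looked roughly like this.  The data source JNDI name is a placeholder for whatever you configured in Glassfish, and the rest of the unit (provider, properties) stays as Roo generated it:

```xml
<!-- transaction-type JTA instead of the Roo default RESOURCE_LOCAL -->
<persistence-unit name="persistenceUnit" transaction-type="JTA">
    <!-- JNDI name of a data source configured in Glassfish; placeholder -->
    <jta-data-source>jdbc/myDataSource</jta-data-source>
    <!-- provider, classes, and properties as generated by Roo -->
</persistence-unit>
```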

So why would I continue in my endeavor to use JSF 2 with my Roo project, and where do I go from here?

We are trying to make development as easy as possible for the designers.  They have next to no Java experience at best and just want to tweak styles, layout, and images.  I believe JSF can do a great job of giving us that separation.  I can get the controls working in the form, and they can move them around and do whatever they want with them.  Also, with the composite components in JSF 2, we should be able to make some very nice reusable components for the designers to use in our application.  Finally, many of those components we should be able to share with other projects in the company.

So what is the solution to all of the above problems?

  • Education.  Obviously learning more about JPA and Spring will make the whole journey a little easier for me.
  • Separate projects.  If we have all of the domain model in one maven project, and have the web based UI in another (which is how it really ought to be anyway) then the java developers can use Eclipse or STS for their IDE and get the AspectJ support, and the web designers can use Netbeans 6.8 for the JSF 2 project and get the great support there.  Because the domain model will simply be a jar dependency for the web project all of the aspect based methods will be compiled into the class, meaning we won’t need any Aspect support within Netbeans.

What would it take to get a JSF 2 UI generator into Roo?

  • The Roo team has released a nearly 100 page document about Roo.  Unfortunately, all of the parts about how to implement a new addon for Roo are marked as TBC.  There’s precious little detail on the web too.  Unfortunately, I’m not quite ready to dive into the source code to try to figure it out.  So, a simple Roo addon isn’t quite there.  Update: The only reference I found regarding writing your own Roo addon is at http://www.slideshare.net/desmax74/spring-roo-internals-javaday-iv and the slides leave a lot to the imagination of the reader.
  • Rumors in the Spring forums are that Spring Faces is going to be a new top level project, freed from the grasp of Spring Web Flow, and that the next version (due in June 2010?) will support JSF 2.  I was quite excited to use Web Flow with JSF 2 for the bean management relating to flows.  For instance, a new whatsit wizard that spans multiple pages and manages the scope of the new whatsit bean.
  • Because JSF doesn’t work so well with Spring MVC without Web Flow, and Web Flow won’t support JSF 2 until Spring Faces is split out and both projects release their next versions, I believe it’ll be months at best before we see any JSF 2 capabilities within Roo.  The only way it could happen sooner is if someone dives into the Roo source code and creates a third-party addon that generates JSF 2 controllers and xhtml pages.

So, there it is.  At least 3 days of pain, trial and error, and research all wrapped up in one blog post.
