Final interface
Final interface is a pattern often used in vendor library style API design. It marks a Java interface in an API as final (either in its Javadoc or in some better place) with the evolution plan to extend it incompatibly (from the point of view of implementers). The hope is that nobody except the vendor itself will ever implement such an interface.
Why does it not work?
DOM2 vs. DOM3 problems are famous. The interfaces in the DOM Java API were made in the final interface style, and as the XML specification was still evolving, it soon turned out the original interfaces were not satisfactory. XML, for example, introduced namespaces and the Java DOM API needed to adapt to them. One may feel this is exactly what final interfaces were designed for! Just add a few methods to them and where is the problem? Well, you break backward compatibility for those who implement the interface - and there were many DOM2 parsers, as at a certain point in time it was very popular to write one's own XML parser.
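A concrete (made up) illustration of the breakage: imagine a DOM2-era interface and a third party parser implementing it. All names below are hypothetical, not taken from the real DOM API:

// API version 1.0
public interface XmlNode {
    String getNodeName();
}

// a third party parser, compiled against version 1.0
public class MyParserNode implements XmlNode {
    public String getNodeName() { return "root"; }
}

// API version 2.0 adds namespace support:
// public interface XmlNode {
//     String getNodeName();
//     String getNamespaceURI(); // new method
// }
// MyParserNode no longer compiles against version 2.0 - it
// does not implement getNamespaceURI().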
If one worked only with the standard XML parser provided by the JDK itself together with the DOM API, everything worked fine. Of course, because of the closest possible proximity! When you package your API together with its (only) implementation, you don't have evolution and versioning problems - the proximity is so intimate that you don't have to think about versioning at all.
However, most of the more complex Java applications were not satisfied with the default Java parser and needed to include a different implementation. And hence the problems began - when one had an implementation of DOM3 provided as a library, but the DOM2 API provided by the JDK, the linkage problems were endless. The JDK's distribution of DOM2, combined with parsers and applications relying on DOM3 (which contains interfaces that are incompatible from the provider's point of view), just created an unsolvable mess.
When does it work?
The above problems can be slightly mitigated if one has good runtime support for modularity, and this may be the reason why the vendor library style seems to be very popular in the OSGi world. If the version range used by clients is wider than the version range used by implementations, proper versioning is possible and the OSGi container can select the right modules to compose a working system. More about this in the proximity essay. However, it needs to be stated that this works only in One to Many and in Few to Many mode - i.e. when there is a single provider (or a few of them) of the DOM parser and many users of the DOM parsing API. Then one can use OSGi RangeDependencies to make sure the implementation has closer proximity than the users of the DOM API.
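For illustration, the manifest entries might look as follows (a sketch with made up version numbers, using org.w3c.dom as the package of the final interfaces). A client of the DOM API imports a wide range:

Import-Package: org.w3c.dom;version="[2.0,4.0)"

while a parser implementation binds itself narrowly to the single version whose interfaces it implements:

Import-Package: org.w3c.dom;version="[3.0,3.1)"

The container can then wire each implementation to exactly the API revision it was compiled against, while clients work with any of them.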
Once you end up with multiple DOM parser implementations in your application (like NetBeans - a large and modular application - did) - i.e. you enter a Many to Many relationship - no close proximity is going to save you. The only saving move is to adhere to best API Design practice and separate the ClientAPI from the ProviderAPIs. As such I am going to include final interface among API Design anti patterns, although I am sure OSGi friends will never try to understand such a better alternative and will stick to their old guns.
JDK: Scratching Own Itch!
Of course most of this discussion is becoming obsolete with JDK8 extender methods. With JDK8 you can add new methods with default implementations into existing interfaces - which is a source and binary compatible extension. Reasonable API writers will then provide default bodies for newly added methods and thereby mitigate the DOM2 vs. DOM3 problems.
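A minimal sketch of such a compatible extension, reusing the hypothetical XmlNode interface from above:

public interface XmlNode {
    String getNodeName();

    // added in version 2.0; DOM2-era implementations still compile
    // and link, because they inherit this default body
    default String getNamespaceURI() {
        return null;
    }
}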
It is however interesting to analyze the reasons why extender methods were introduced. Of course, people were crying for them for ages, but the JDK team kept refusing all such efforts. Why? To keep the purity of the language.
However, when they finally felt all the pain themselves - i.e. when the Collection and similar interfaces were found insufficient - they immediately changed their mind. And of course, because they control the VM, they could do miracles (unlike us, regular outsider beings). It is way easier to resolve backward compatibility issues if you control the way linkage is done!
All that is needed is to motivate the owner of HotSpot to implement some small enhancement. That however requires the owner to feel the pain - i.e. to run into a common problem in the context of the JDK itself - and that is tough: the JDK is a specific, isolated project and the problems its developers face are often too different from real world ones. But when the rare situation happens and the JDK feels the same pain as the rest of the world, scratching one's own itch can really move things forward!
Enforcing final interface During Compilation
There is a way to turn the don't implement me, please Javadoc warning into something real, something that will open the eyes of everyone who tries to implement a final interface. There is a way to fail one's build in case the advice is violated! The check can be done with a simple AnnotationProcessor. When you have your final interface:
package org.apidesign.demo.finalinterface;

public interface FinalInterface {
    public int dontImplementMe();
}
accompany its definition in a JAR file with an AnnotationProcessor. The processor will be executed every time somebody includes your JAR file on the compilation classpath and can check whether there is a violation of the final interface contract:
package org.apidesign.demo.finalinterface;

import java.util.Collection;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.Processor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.lang.model.type.TypeMirror;
import javax.tools.Diagnostic;
import org.openide.util.lookup.ServiceProvider;

@ServiceProvider(service = Processor.class)
@SupportedAnnotationTypes("*")
public final class FinalEnforcingProcessor extends AbstractProcessor {
    @Override
    public boolean process(
        Set<? extends TypeElement> annotations, RoundEnvironment roundEnv
    ) {
        // inspect every top-level element compiled in this round
        checkForViolations(roundEnv.getRootElements());
        return true;
    }

    private void checkForViolations(Collection<? extends Element> all) {
        for (Element e : all) {
            if (e instanceof TypeElement) {
                TypeElement te = (TypeElement) e;
                /* exception for the only known implementation:
                if ("org.apidesign.demo.finalinterface.AllowedImplementationTest".equals(
                    te.getQualifiedName().toString())
                ) continue;
                */
                // does this type directly implement the final interface?
                for (TypeMirror m : te.getInterfaces()) {
                    if (FinalInterface.class.getName().equals(m.toString())) {
                        processingEnv.getMessager().printMessage(
                            Diagnostic.Kind.ERROR,
                            "Cannot implement FinalInterface", e
                        );
                    }
                }
            }
            // recurse into inner classes and other enclosed elements
            checkForViolations(e.getEnclosedElements());
        }
    }
}
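To see the enforcement in action, try to compile a class that violates the contract (a hypothetical violator; the error message is the one the processor above reports):

package org.apidesign.demo.finalinterface.client;

import org.apidesign.demo.finalinterface.FinalInterface;

// javac fails with: error: Cannot implement FinalInterface
public class SneakyImplementation implements FinalInterface {
    @Override
    public int dontImplementMe() {
        return 42;
    }
}

The @ServiceProvider annotation (from NetBeans' org.openide.util.lookup library) generates the META-INF/services/javax.annotation.processing.Processor registration that makes javac pick the processor up; if you want to avoid that dependency, you can create the registration file by hand and put the processor's fully qualified name inside.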
A compile-time check of this kind makes the API Designer in me way happier! Users of your API may be clueless (as I always claim) and may not bother reading documentation, but none of them can ignore a failing build!
A final interface accompanied by an AnnotationProcessor is finally a real API Design Pattern!