RangeDependenciesAnalysed
From APIDesign
When analysing dependencies it can easily be shown that RangeDependencies (as used by OSGi) are too flexible and may lead to NP-Complete problems. However, recently I started to ask the following question: it is known that LibraryReExportIsNPComplete, and that the problem can be fixed by generating so-called complete repositories. Can't the same trick be applied to repositories using RangeDependencies as well?
My current feeling is that it can. This page is my attempt to turn that feeling into something more material.
Lower Bound
Each dependency has a lower bound, the minimal required version of a library that we can work against. When producing our application, it is in our interest to verify that we can really compile and create our application against that lower bound, to make sure we are using only classes and methods available in the lower bound version. The easiest way to ensure this is to compile against the lower bound version of each library we define a dependency on.
Or from the opposite perspective: compiling against a newer version than the lower bound is too error-prone. NetBeans allows this, and it almost always leads to LinkageErrors that render the whole application unusable at random, unexpected moments.
Lower bound is the version that we compile against.
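For illustration, this is how a lower bound of 1.5 might be declared in an OSGi bundle manifest (the package name org.example.api is made up). In OSGi a bare version attribute on Import-Package is itself a lower bound, meaning [1.5,∞):

```
Import-Package: org.example.api;version="1.5"
```

The bundle is then compiled against exactly version 1.5 of that package, so it cannot accidentally use classes or methods introduced later.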
Upper Bound
The upper bound, on the other hand, is the version that we know we can still work with.
There are two vague terms in that sentence. What does it mean to know?
- Know can mean to be sure. E.g. we have verified that we really work with that version. We have downloaded that version of a library, executed all our tests and certified it. We really know our application works with that version. In some situations this kind of knowledge is required.
- Know can, however, mean to believe. Often we can trust the producer of the library to conform to some versioning scheme (like Semantic versioning), and based on such trust we can estimate the upper bound.
What does it mean to work?
- Work can mean really runs as expected. This kind of verification is hard and tough to automate; it requires the work of a quality department and is more a process decision than something scientifically definable.
- Work can be reduced to links. This is definitely a much less satisfying answer (though at least it prevents linkage errors, so it gives the application a chance to recover), but it is easily measurable: BinaryCompatibility can be verified by tools like Sigtest.
If we only want to believe that our application will more or less work, and be sure that it will at least link against a new version, it is best to use predefined, widely accepted RangeDependencies: e.g. when using a library with lower bound 1.5, the upper bound would automatically be 2.0, giving the range [1.5,2.0).
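The default "optimistic" range above can be sketched in a few lines of Java. This is not OSGi API; the class and method names are made up for illustration, and versions are simplified to a major.minor scheme:

```java
// Sketch of the optimistic range [lower, nextMajor) derived from a
// lower bound, e.g. lower bound 1.5 yields the range [1.5,2.0).
public class VersionRange {
    // Parses "major.minor" into an int pair (helper name is hypothetical).
    static int[] parse(String v) {
        String[] p = v.split("\\.");
        return new int[] { Integer.parseInt(p[0]), Integer.parseInt(p[1]) };
    }

    // True when version v lies in [lower, nextMajor(lower)):
    // same major version, minor version at least the lower bound's.
    static boolean inDefaultRange(String lower, String v) {
        int[] lo = parse(lower);
        int[] ver = parse(v);
        return ver[0] == lo[0] && ver[1] >= lo[1];
    }

    public static void main(String[] args) {
        System.out.println(inDefaultRange("1.5", "1.7")); // true: inside [1.5,2.0)
        System.out.println(inDefaultRange("1.5", "2.0")); // false: 2.0 is excluded
        System.out.println(inDefaultRange("1.5", "1.4")); // false: below the lower bound
    }
}
```

The half-open upper end encodes the trust in semantic versioning: minor releases are believed to link, major releases are not.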
If one wants to be sure the application really runs as expected, it is wiser to specify a narrow range. If version 1.5 is the newest version, one can even use [1.5,1.5] and, only when new versions are released and certified, release new versions of the application and expand the range to [1.5,1.6].
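The paranoid variant can be sketched the same way, assuming a closed range [lower,upper] that is only expanded once a new version has been certified. Again, the names below are hypothetical:

```java
// Sketch of the "paranoid" closed range [lower,upper]: every new
// release must be explicitly admitted by expanding the upper bound.
public class NarrowRange {
    // Compares two "major.minor" version strings numerically.
    static int compare(String a, String b) {
        String[] pa = a.split("\\."), pb = b.split("\\.");
        int major = Integer.compare(Integer.parseInt(pa[0]), Integer.parseInt(pb[0]));
        if (major != 0) return major;
        return Integer.compare(Integer.parseInt(pa[1]), Integer.parseInt(pb[1]));
    }

    // True when v lies in the closed range [lower, upper].
    static boolean inClosedRange(String lower, String upper, String v) {
        return compare(lower, v) <= 0 && compare(v, upper) <= 0;
    }

    public static void main(String[] args) {
        // [1.5,1.5] rejects a freshly released 1.6; only after certifying
        // 1.6 is the range expanded to [1.5,1.6], which admits it.
        System.out.println(inClosedRange("1.5", "1.5", "1.6")); // false
        System.out.println(inClosedRange("1.5", "1.6", "1.6")); // true
    }
}
```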
The latter, paranoid approach is likely to be used more often in end-user applications. The former, optimistic one seems more suitable for reusable libraries.