RangeDependenciesAnalysed

When analysing dependencies it can easily be proven that RangeDependencies (as used by OSGi) are too flexible and may lead to NP-Complete problems. However, recently I started to ask the following question: it is known that LibraryReExportIsNPComplete and that the problem can be fixed by generating so-called complete repositories. Can't the same trick be applied to repositories using RangeDependencies as well?

My current feeling is that it can. This page is my attempt to formalize that feeling into something more concrete.
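For reference, a range dependency in an OSGi bundle manifest looks like this (the package name is only an example); it accepts any version from 1.2 inclusive up to, but excluding, 2.0:

  Import-Package: org.example.util;version="[1.2,2.0)"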

Lower Bound

Each dependency has a lower bound: the minimal version of a library that we can work against. When producing our application, it is in our interest to verify that we can really compile and build the application against that lower bound, to make sure we use only classes and methods available in the lower bound version. The easiest way to ensure this is to compile against the lower bound version of each library we declare a dependency on.

Or from the opposite perspective: compiling against a newer version than the lower bound is too error-prone. NetBeans allows this, and almost always it leads to LinkageErrors that render the whole application unusable at random and unexpected moments.

The lower bound is the version that we compile against.
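As an illustration (all class and method names here are made up, not taken from any real library): suppose newHelper was added in version 1.1 of a library, while our declared lower bound is still 1.0. Compiling against 1.1 succeeds, yet running against 1.0 fails:

  // --- Util.java, the library as it looks in version 1.1 ---
  // newHelper() did not exist in version 1.0, our declared lower bound.
  public class Util {
      public static void newHelper() {
      }
  }

  // --- Client.java, our application, compiled against version 1.1 ---
  public class Client {
      public static void main(String[] args) {
          // Compiles fine against 1.1. When the application is later run
          // against version 1.0, this call throws NoSuchMethodError (a
          // LinkageError) at the moment the code is first executed.
          Util.newHelper();
      }
  }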

Upper Bound

The upper bound, on the other hand, is the version that we know we can still work with.

There are just two vague terms in the previous sentence. What does it mean to know?

  • Know can mean to be sure. For example, we have verified that we really work with that version: we have downloaded that version of the library, executed all our tests and certified it. We really know our application works with that version. In some situations this kind of knowledge is required.
  • Know can, however, also mean to believe. Often we can trust the producer of the library to conform to some versioning scheme (like Semantic versioning), and based on such trust we can estimate the upper bound (a sketch of this follows the list).
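A minimal sketch of the believe approach, assuming the producer really follows semantic versioning, so only a major version bump may break compatibility (the class and method names below are illustrative, not an existing API):

  // Estimate an upper bound from a lower bound, assuming semantic
  // versioning: anything below the next major version is believed to work.
  public final class RangeEstimator {
      /** For a lower bound like "1.3.2" returns the range "[1.3.2,2.0.0)". */
      public static String estimateRange(String lowerBound) {
          int major = Integer.parseInt(lowerBound.split("\\.")[0]);
          return "[" + lowerBound + "," + (major + 1) + ".0.0)";
      }

      public static void main(String[] args) {
          System.out.println(estimateRange("1.3.2")); // prints [1.3.2,2.0.0)
      }
  }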

What does it mean to work?

  • Work can mean really runs as expected. This kind of verification is hard and tough to automate; it requires the work of a quality department and is more a process decision than something scientifically definable.
  • Work can be reduced to it links. This is a much less satisfying answer (but at least it prevents the linkage errors, so it gives the application a chance to recover), yet it is easily measurable: BinaryCompatibility can be verified by tools like Sigtest (a crude sketch of the idea follows the list).
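A tool like Sigtest compares recorded API signatures with the classes actually present on the class path. The following is only a crude, hand-rolled stand-in for that idea (the checked class and method names are illustrative): it verifies that a piece we compiled against can still be resolved, which is exactly the it links level of working.

  import java.lang.reflect.Method;

  // Crude stand-in for a signature test: check that a class and method we
  // compiled against can still be resolved on the current class path.
  // Real tools such as Sigtest cover whole APIs, fields, inheritance, etc.
  public final class LinkCheck {
      public static void main(String[] args) {
          checkMethod("org.example.Util", "newHelper");
      }

      private static void checkMethod(String className, String methodName) {
          try {
              Method m = Class.forName(className).getMethod(methodName);
              System.out.println("OK, still links: " + m);
          } catch (ClassNotFoundException | NoSuchMethodException ex) {
              System.out.println("Would not link: " + className + "." + methodName);
          }
      }
  }

Passing such a check against some version of a library is what this page treats as the upper bound in the it links sense.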