Java Numerical Library 7.1 has just been released and I’m excited to announce that it gives you a better, faster way to solve complex linear programming problems.
Our new Sparse Linear Programming solver lets you input data in sparse matrix format, so only the non-zero elements are stored. When the solver runs, it operates only on these elements, using far less memory and executing much faster. In some of our test cases, the algorithms ran to completion in seconds rather than hours on the same data sets.
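To give a feel for the idea, here is a minimal sketch of coordinate-style sparse storage: only the non-zero entries are kept as (row, column, value) triplets, and a matrix-vector product visits only those entries. This is an illustrative example, not the JMSL API itself; the class and method names are made up for the demonstration.

```java
import java.util.Arrays;

// Illustrative coordinate (COO) sparse storage: a 3x3 matrix with 9 slots,
// of which only 4 are non-zero, so only 4 values are stored.
public class SparseMatVec {
    // Parallel arrays holding just the non-zero entries.
    static int[] rows = {0, 1, 2, 2};
    static int[] cols = {0, 1, 0, 2};
    static double[] vals = {4.0, 5.0, 2.0, 3.0};

    // Computes y = A * x, touching only the stored non-zeros.
    static double[] multiply(double[] x, int m) {
        double[] y = new double[m];
        for (int k = 0; k < vals.length; k++) {
            y[rows[k]] += vals[k] * x[cols[k]];
        }
        return y;
    }

    public static void main(String[] args) {
        double[] y = multiply(new double[] {1.0, 2.0, 3.0}, 3);
        System.out.println(Arrays.toString(y)); // [4.0, 10.0, 11.0]
    }
}
```

The work done is proportional to the number of non-zeros, not the full matrix size, which is exactly why a sparse solver scales so much better on mostly-empty data.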
For example, say you run a popular video streaming service and want to serve content to your users as fast as possible, perhaps based on user ratings and geographic location. Not all users rate all movies, so only a relatively small amount of data is available, yet you need to decide the optimal storage format and location of movies across geographically separated servers. How does your algorithm use this limited data to choose formats and locations that serve media effectively, so that someone in California isn't waiting for media to stream from Asia?
This is where sparse matrices come in. As you may know, for large linear programming problems involving many thousands of variables, constraint matrices are typically extremely sparse, as in this example. If you optimize this data with a typical dense linear programming solver, you can get a lot of memory thrashing and wait a very long time for the results, or get no results at all if your user gets frustrated and quits.
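A quick back-of-the-envelope calculation shows why the storage format matters so much at this scale. The problem size and density below are hypothetical, chosen only to illustrate the magnitude of the difference between dense and triplet-style sparse storage.

```java
// Rough storage comparison for a large, mostly-empty constraint matrix.
// Dense: every entry stored as an 8-byte double.
// Sparse (COO-style): one double plus two int indices per non-zero entry.
public class SparseSavings {
    static long denseBytes(int m, int n) {
        return (long) m * n * 8;
    }

    static long sparseBytes(long nonZeros) {
        return nonZeros * (8 + 4 + 4);
    }

    public static void main(String[] args) {
        int m = 100_000, n = 100_000;   // hypothetical 100,000 x 100,000 matrix
        long nnz = 500_000;             // hypothetical: 0.005% of entries non-zero
        System.out.println(denseBytes(m, n) / (1024L * 1024L) + " MB dense");
        System.out.println(sparseBytes(nnz) / (1024L * 1024L) + " MB sparse");
    }
}
```

For these (made-up but plausible) numbers, the dense representation needs roughly 80 GB while the sparse one fits in under 10 MB, so the dense version cannot even stay in memory on most machines, let alone be solved quickly.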
Using less memory makes your application more efficient, and taking less time frees you up to tackle the bigger, more important problems. When working with optimization problems, it always makes sense to work optimized yourself, and this new release of JMSL 7.1 will help you do exactly that.