6 Commits
v1.3 ... v1.4

5 changed files with 155 additions and 62 deletions

View File

@@ -29,17 +29,13 @@ Unfortunately, it's a fairly new algorithm, and not yet implemented by the graph
So this program instead uses the Fibonacci heap-based algorithm of Fredman and Tarjan (1987), which has a worst-case
runtime of **O(n (n log(n) + m))**. The algorithm is implemented as described in Mehlhorn and Näher (1999).
The current version of the program uses a pairing heap instead of a Fibonacci heap for its priority queue,
which has lower theoretical efficiency but also lower complexity overhead, and is often equivalently performant
in practice.
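
As a rough illustration of how the heap choice plugs into the matching step, the sketch below builds a toy bipartite graph and runs JGraphT's `MaximumWeightBipartiteMatching` with a jheaps `PairingHeap` supplied as the priority queue; swapping in `org.jheaps.tree.FibonacciHeap` only changes the supplier lambda. The class name `HeapChoiceSketch` and the toy graph are illustrative only; the constructor and heap classes are the same ones used in `Simulator.java` further down in this diff.

```java
import java.util.Set;

import org.jgrapht.alg.interfaces.MatchingAlgorithm;
import org.jgrapht.alg.matching.MaximumWeightBipartiteMatching;
import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;
import org.jheaps.tree.PairingHeap;

public class HeapChoiceSketch {
    public static void main(String[] args) {
        // Toy bipartite graph: partition A = {1, 2}, partition B = {3, 4}
        SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph =
                new SimpleWeightedGraph<>(DefaultWeightedEdge.class);
        Set<Integer> partitionA = Set.of(1, 2);
        Set<Integer> partitionB = Set.of(3, 4);
        partitionA.forEach(graph::addVertex);
        partitionB.forEach(graph::addVertex);
        graph.setEdgeWeight(graph.addEdge(1, 3), 5.0);
        graph.setEdgeWeight(graph.addEdge(1, 4), 2.0);
        graph.setEdgeWeight(graph.addEdge(2, 4), 4.0);

        // The fourth argument is the heap supplier for the algorithm's priority
        // queue; replace PairingHeap with org.jheaps.tree.FibonacciHeap to
        // compare the two heap types.
        MaximumWeightBipartiteMatching<Integer, DefaultWeightedEdge> matcher =
                new MaximumWeightBipartiteMatching<>(graph, partitionA, partitionB,
                        comparator -> new PairingHeap<>(comparator));

        MatchingAlgorithm.Matching<Integer, DefaultWeightedEdge> matching = matcher.getMatching();
        System.out.println("Matching weight: " + matching.getWeight()); // expect 9.0 (edges 1-3 and 2-4)
    }
}
```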
## USAGE
### RUNNING THE PROGRAM
[Download the current version of BiGpairSEQ_Sim.](https://gitea.ejsf.synology.me/efischer/BiGpairSEQ/releases)
BiGpairSEQ_Sim is an executable .jar file. Requires Java 11 or higher. [OpenJDK 17](https://jdk.java.net/17/)
BiGpairSEQ_Sim is an executable .jar file. Requires Java 14 or higher. [OpenJDK 17](https://jdk.java.net/17/)
recommended.
Run with the command:
@@ -82,16 +78,17 @@ These files are often generated in sequence. When entering filenames, it is not
(.csv or .ser). When reading or writing files, the program will automatically add the correct extension to any filename without one.
To save file I/O time, the most recent instance of each of these four
files either generated or read from disk can be cached in program memory. This is especially important for Graph/Data files,
files either generated or read from disk can be cached in program memory. This could be important for Graph/Data files,
which can be several gigabytes in size. Since some simulations may require running multiple,
differently-configured BiGpairSEQ matchings on the same graph, keeping the most recent graph cached can reduce execution time
differently-configured BiGpairSEQ matchings on the same graph, keeping the most recent graph cached may reduce execution time.
(The manipulation necessary to re-use a graph incurs its own performance overhead, though, which may scale with graph
size faster than file I/O does. If so, caching is best for smaller graphs.)
Subsequent uses of the same data file won't need to be read in again until another file of that type is used or generated,
When caching is active, the same data file won't be read in again on subsequent uses until another file of that type is used or generated,
or caching is turned off for that file type. The program checks whether it needs to update its cached data by comparing
filenames as entered by the user. On encountering a new filename, the program flushes its cache and reads in the new file.
The program's caching behavior can be controlled in the Options menu. By default, caching for cell sample and
sample plate files is OFF, and caching for graph/data files is ON.
The program's caching behavior can be controlled in the Options menu. By default, all caching is OFF.
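
The filename check described above can be pictured with a small, purely hypothetical sketch; `GraphCacheSketch` and its fields are not BiGpairSEQ's actual implementation, they only illustrate the compare-flush-reload behavior under the assumption that the cached graph is keyed by the filename the user typed.

```java
// Hypothetical illustration of the filename-based cache check described above;
// BiGpairSEQ's real fields and read methods may differ.
class GraphCacheSketch {
    private String cachedFilename = null; // filename of the copy currently held in memory
    private Object cachedGraph = null;    // the cached graph/data object
    private boolean cacheGraph = false;   // caching is OFF by default

    Object getGraph(String filename) {
        // Cache hit only when caching is on and the entered name matches the cached one
        if (cacheGraph && filename.equals(cachedFilename)) {
            return cachedGraph;
        }
        // New filename (or caching off): flush the old copy and read from disk
        Object graph = readGraphFromDisk(filename);
        cachedFilename = cacheGraph ? filename : null;
        cachedGraph = cacheGraph ? graph : null;
        return graph;
    }

    private Object readGraphFromDisk(String filename) {
        // Placeholder for the real .ser/.csv read step
        throw new UnsupportedOperationException("illustrative stub");
    }
}
```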
#### Cell Sample Files
Cell Sample files consist of any number of distinct "T cells." Every cell contains
@@ -256,7 +253,8 @@ slightly less time than the simulation itself. Real elapsed time from start to f
* ~~Try invoking GC at end of workloads to reduce paging to disk~~ DONE
* Hold graph data in memory until another graph is read-in? ~~ABANDONED~~ ~~UNABANDONED~~ DONE
* ~~*No, this won't work, because BiGpairSEQ simulations alter the underlying graph based on filtering constraints. Changes would cascade with multiple experiments.*~~
* Might have figured out a way to do it, by taking edges out and then putting them back into the graph. This may actually be possible. If so, awesome.
* Might have figured out a way to do it, by taking edges out and then putting them back into the graph. This may actually be possible.
* It is possible, though the modifications to the graph incur their own performance penalties. Needs testing to see which option is best (see the sketch after this list).
* See if there's a reasonable way to reformat Sample Plate files so that wells are columns instead of rows.
* ~~Problem is variable number of cells in a well~~
* ~~Apache Commons CSV library writes entries a row at a time~~
@@ -270,9 +268,10 @@ slightly less time than the simulation itself. Real elapsed time from start to f
* Re-implement CDR1 matching method
* Implement Duan and Su's maximum weight matching algorithm
* Add controllable algorithm-type parameter?
* Test whether pairing heap (currently used) or Fibonacci heap is more efficient for priority queue in current matching algorithm
* in theory Fibonacci heap should be more efficient, but complexity overhead may eliminate theoretical advantage
* Add controllable heap-type parameter?
* ~~Test whether pairing heap (currently used) or Fibonacci heap is more efficient for priority queue in current matching algorithm~~ DONE
* ~~in theory Fibonacci heap should be more efficient, but complexity overhead may eliminate theoretical advantage~~
* ~~Add controllable heap-type parameter?~~
* Parameter implemented. For large graphs, Fibonacci heap wins. Now the new default.
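
For reference, the remove-and-restore pattern behind the graph-caching TODO items above is the one introduced in `GraphModificationFunctions.java` and `Simulator.java` later in this diff, condensed here into a sketch (graph construction and the matching step are elided, and the class is assumed to sit in the same package as `BiGpairSEQ`):

```java
import java.util.ArrayList;
import java.util.List;

import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;

// Condensed sketch of the remove-and-restore pattern used to reuse a cached graph.
class GraphReuseSketch {
    static void runOnCachedGraph(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
                                 int lowThreshold, int highThreshold) {
        boolean saveEdges = BiGpairSEQ.cacheGraph();
        List<Integer[]> removedEdges = new ArrayList<>();

        // When saveEdges is true the filter deletes edges and returns them;
        // otherwise it just zeroes their weights in place.
        removedEdges.addAll(GraphModificationFunctions.filterByOverlapThresholds(
                graph, lowThreshold, highThreshold, saveEdges));

        // ... find the maximum weight matching here ...

        if (saveEdges) {
            // Put the removed edges back so the cached graph can be reused by
            // the next, differently-configured matching run.
            GraphModificationFunctions.addRemovedEdges(graph, removedEdges);
        }
    }
}
```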

View File

@@ -12,7 +12,8 @@ public class BiGpairSEQ {
private static String graphFilename = null;
private static boolean cacheCells = false;
private static boolean cachePlate = false;
private static boolean cacheGraph = true;
private static boolean cacheGraph = false;
private static String priorityQueueHeapType = "FIBONACCI";
public static void main(String[] args) {
if (args.length == 0) {
@@ -151,4 +152,16 @@ public class BiGpairSEQ {
}
BiGpairSEQ.cacheGraph = cacheGraph;
}
public static String getPriorityQueueHeapType() {
return priorityQueueHeapType;
}
public static void setPairingHeap() {
priorityQueueHeapType = "PAIRING";
}
public static void setFibonacciHeap() {
priorityQueueHeapType = "FIBONACCI";
}
}

View File

@@ -6,59 +6,74 @@ import java.util.List;
import java.util.Map;
import java.util.Set;
public abstract class GraphModificationFunctions {
public interface GraphModificationFunctions {
//remove over- and under-weight edges
public static List<Integer[]> filterByOverlapThresholds(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
int low, int high) {
static List<Integer[]> filterByOverlapThresholds(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
int low, int high, boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
if ((graph.getEdgeWeight(e) > high) || (graph.getEdgeWeight(e) < low)) {
if(saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer weight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, weight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
//Remove edges for pairs with large occupancy discrepancy
public static List<Integer[]> filterByRelativeOccupancy(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
static List<Integer[]> filterByRelativeOccupancy(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer maxOccupancyDifference) {
Integer maxOccupancyDifference, boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
Integer betaOcc = betaWellCounts.get(plateVtoBMap.get(graph.getEdgeTarget(e)));
if (Math.abs(alphaOcc - betaOcc) >= maxOccupancyDifference) {
if (saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer weight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, weight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
//Remove edges for pairs where overlap size is significantly lower than the well occupancy
public static List<Integer[]> filterByOverlapPercent(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
static List<Integer[]> filterByOverlapPercent(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer minOverlapPercent) {
Integer minOverlapPercent,
boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
@@ -66,20 +81,27 @@ public abstract class GraphModificationFunctions {
double weight = graph.getEdgeWeight(e);
double min = minOverlapPercent / 100.0;
if ((weight / alphaOcc < min) || (weight / betaOcc < min)) {
if(saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer intWeight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, intWeight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
public static void addRemovedEdges(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
static void addRemovedEdges(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
List<Integer[]> removedEdges) {
for (Integer[] edge : removedEdges) {
DefaultWeightedEdge e = graph.addEdge(edge[0], edge[1]);

View File

@@ -38,10 +38,10 @@ public class InteractiveInterface {
case 3 -> makeCDR3Graph();
case 4 -> matchCDR3s();
//case 6 -> matchCellsCDR1();
case 8 -> options();
case 8 -> mainOptions();
case 9 -> acknowledge();
case 0 -> quit = true;
default -> throw new InputMismatchException("Invalid input.");
default -> System.out.println("Invalid input.");
}
} catch (InputMismatchException | IOException ex) {
System.out.println(ex);
@@ -493,13 +493,14 @@ public class InteractiveInterface {
// }
// }
private static void options(){
private static void mainOptions(){
boolean backToMain = false;
while(!backToMain) {
System.out.println("\n--------------OPTIONS---------------");
System.out.println("1) Turn " + getOnOff(!BiGpairSEQ.cacheCells()) + " cell sample file caching");
System.out.println("2) Turn " + getOnOff(!BiGpairSEQ.cachePlate()) + " plate file caching");
System.out.println("3) Turn " + getOnOff(!BiGpairSEQ.cacheGraph()) + " graph/data file caching");
System.out.println("4) Maximum weight matching algorithm options");
System.out.println("0) Return to main menu");
try {
input = sc.nextInt();
@@ -507,8 +508,9 @@ public class InteractiveInterface {
case 1 -> BiGpairSEQ.setCacheCells(!BiGpairSEQ.cacheCells());
case 2 -> BiGpairSEQ.setCachePlate(!BiGpairSEQ.cachePlate());
case 3 -> BiGpairSEQ.setCacheGraph(!BiGpairSEQ.cacheGraph());
case 4 -> algorithmOptions();
case 0 -> backToMain = true;
default -> throw new InputMismatchException("Invalid input.");
default -> System.out.println("Invalid input");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
@@ -517,11 +519,49 @@ public class InteractiveInterface {
}
}
/**
* Helper function for printing menu items in mainOptions(). Returns a string based on the value of the parameter.
*
* @param b - a boolean value
* @return String "on" if b is true, "off" if b is false
*/
private static String getOnOff(boolean b) {
if (b) { return "on";}
else { return "off"; }
}
private static void algorithmOptions(){
boolean backToOptions = false;
while(!backToOptions) {
System.out.println("\n---------ALGORITHM OPTIONS----------");
System.out.println("1) Use scaling algorithm by Duan and Su.");
System.out.println("2) Use LEDA book algorithm with Fibonacci heap priority queue");
System.out.println("3) Use LEDA book algorithm with pairing heap priority queue");
System.out.println("0) Return to Options menu");
try {
input = sc.nextInt();
switch (input) {
case 1 -> System.out.println("This option is not yet implemented. Choose another.");
case 2 -> {
BiGpairSEQ.setFibonacciHeap();
System.out.println("MWM algorithm set to LEDA with Fibonacci heap");
backToOptions = true;
}
case 3 -> {
BiGpairSEQ.setPairingHeap();
System.out.println("MWM algorithm set to LEDA with pairing heap");
backToOptions = true;
}
case 0 -> backToOptions = true;
default -> System.out.println("Invalid input");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
}
}
private static void acknowledge(){
System.out.println("This program simulates BiGpairSEQ, a graph theory based adaptation");
System.out.println("of the pairSEQ algorithm for pairing T cell receptor sequences.");

View File

@@ -3,6 +3,7 @@ import org.jgrapht.alg.matching.MaximumWeightBipartiteMatching;
import org.jgrapht.generate.SimpleWeightedBipartiteGraphMatrixGenerator;
import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;
import org.jheaps.tree.FibonacciHeap;
import org.jheaps.tree.PairingHeap;
import java.math.BigDecimal;
@@ -16,7 +17,7 @@ import java.util.stream.IntStream;
import static java.lang.Float.*;
//NOTE: "sequence" in method and variable names refers to a peptide sequence from a simulated T cell
public class Simulator {
public class Simulator implements GraphModificationFunctions {
private static final int cdr3AlphaIndex = 0;
private static final int cdr3BetaIndex = 1;
private static final int cdr1AlphaIndex = 2;
@@ -146,8 +147,8 @@ public class Simulator {
Integer highThreshold, Integer maxOccupancyDifference,
Integer minOverlapPercent, boolean verbose) {
Instant start = Instant.now();
//Integer arrays will contain TO VERTEX, FROM VERTEX, and WEIGHT (which I'll need to cast to double)
List<Integer[]> removedEdges = new ArrayList<>();
boolean saveEdges = BiGpairSEQ.cacheGraph();
int numWells = data.getNumWells();
Integer alphaCount = data.getAlphaCount();
Integer betaCount = data.getBetaCount();
@@ -160,33 +161,50 @@ public class Simulator {
//remove edges with weights outside given overlap thresholds, add those to removed edge list
if(verbose){System.out.println("Eliminating edges with weights outside overlap threshold values");}
removedEdges.addAll(GraphModificationFunctions.filterByOverlapThresholds(graph, lowThreshold, highThreshold));
removedEdges.addAll(GraphModificationFunctions.filterByOverlapThresholds(graph, lowThreshold, highThreshold, saveEdges));
if(verbose){System.out.println("Over- and under-weight edges removed");}
//remove edges between vertices with too small an overlap size, add those to removed edge list
if(verbose){System.out.println("Eliminating edges with weights less than " + minOverlapPercent.toString() +
" percent of vertex occupancy value.");}
removedEdges.addAll(GraphModificationFunctions.filterByOverlapPercent(graph, alphaWellCounts, betaWellCounts,
plateVtoAMap, plateVtoBMap, minOverlapPercent));
plateVtoAMap, plateVtoBMap, minOverlapPercent, saveEdges));
if(verbose){System.out.println("Edges with weights too far below a vertex occupancy value removed");}
//Filter by relative occupancy
if(verbose){System.out.println("Eliminating edges between vertices with occupancy difference > "
+ maxOccupancyDifference);}
removedEdges.addAll(GraphModificationFunctions.filterByRelativeOccupancy(graph, alphaWellCounts, betaWellCounts,
plateVtoAMap, plateVtoBMap, maxOccupancyDifference));
plateVtoAMap, plateVtoBMap, maxOccupancyDifference, saveEdges));
if(verbose){System.out.println("Edges between vertices of with excessively different occupancy values " +
"removed");}
//Find Maximum Weighted Matching
//using jheaps library class PairingHeap for improved efficiency
if(verbose){System.out.println("Finding maximum weighted matching");}
//Attempting to use addressable heap to improve performance
MaximumWeightBipartiteMatching maxWeightMatching =
new MaximumWeightBipartiteMatching(graph,
MaximumWeightBipartiteMatching maxWeightMatching;
//Use correct heap type for priority queue
String heapType = BiGpairSEQ.getPriorityQueueHeapType();
switch (heapType) {
case "PAIRING" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(),
plateVtoBMap.keySet(),
i -> new PairingHeap(Comparator.naturalOrder()));
}
case "FIBONACCI" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(),
plateVtoBMap.keySet(),
i -> new FibonacciHeap(Comparator.naturalOrder()));
}
default -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(),
plateVtoBMap.keySet());
}
}
//get the matching
MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = maxWeightMatching.getMatching();
if(verbose){System.out.println("Matching completed");}
Instant stop = Instant.now();
@@ -292,10 +310,11 @@ public class Simulator {
}
}
if(saveEdges) {
//put the removed edges back on the graph
System.out.println("Restoring removed edges to graph.");
GraphModificationFunctions.addRemovedEdges(graph, removedEdges);
}
//return MatchingResult object
return output;
}