27 Commits

Author SHA1 Message Date
eugenefischer
58cdf9ae93 Lookback AA implementation, doesn't currently work 2023-04-09 20:45:03 -05:00
eugenefischer
202ad4c834 mention forward/reverse auction algorithms 2023-04-09 20:42:58 -05:00
eugenefischer
96d49d0034 clarifying comment 2023-04-09 19:48:43 -05:00
eugenefischer
d8e5f7ece0 update todo 2023-04-09 13:00:41 -05:00
eugenefischer
9c81d919b4 add disclosure section 2023-01-18 16:28:16 -06:00
eugenefischer
70b08e7c22 Bugfixes and streamlining 2022-10-22 17:59:01 -05:00
eugenefischer
44158d264c Correct sequence count 2022-10-22 16:16:32 -05:00
eugenefischer
e97c2989db Add dropout rate calculation to read-in of data from plate file (this may slow down read-in by a lot) 2022-10-22 16:04:41 -05:00
eugenefischer
f7709ada73 Change order of metadata comments 2022-10-22 15:50:35 -05:00
eugenefischer
25b37eff48 renamed to MaximumIntegerWeightBipartiteAuctionMatching 2022-10-22 15:00:22 -05:00
eugenefischer
fbbb5a8792 Update comments 2022-10-22 14:59:43 -05:00
eugenefischer
4b9d7f8494 Add option to select matching algorithm type, rename types in output 2022-10-22 14:59:24 -05:00
eugenefischer
0de12a3a12 Refactor to use selected algorithm type 2022-10-22 14:58:40 -05:00
eugenefischer
3c2ec9002e Add field for algorithm type, methods to set algorithm type 2022-10-22 14:13:31 -05:00
eugenefischer
bcf3af5a83 Update algorithm type names 2022-10-22 14:10:00 -05:00
eugenefischer
fcca22a2f0 Rename class, modify bidding to include marginal item value 2022-10-22 13:18:43 -05:00
eugenefischer
910de0ce9d Fix typos 2022-10-21 13:46:10 -05:00
eugenefischer
ef349ea5f6 Correctly store matching weight 2022-10-14 18:44:56 -05:00
eugenefischer
174db66c46 Clean up comments 2022-10-14 18:31:32 -05:00
eugenefischer
b3273855a6 Test simpler source/target differentiation 2022-10-14 18:11:21 -05:00
eugenefischer
51c1bc2551 Skip edges with zero weight 2022-10-14 18:09:34 -05:00
eugenefischer
f7d522e95d Comment out old MWM algorithm, add auction algorithm 2022-10-14 17:38:07 -05:00
eugenefischer
5f0c089b0a add getter for matchingWeight 2022-10-14 17:37:40 -05:00
eugenefischer
d3066095d9 add getter/setter for potential 2022-10-14 17:32:37 -05:00
eugenefischer
55a5d9a892 Making fields final 2022-10-14 17:32:21 -05:00
eugenefischer
49708f2f8a Initial auction algorithm implementation 2022-10-14 17:31:59 -05:00
eugenefischer
c7934ca498 update TODO 2022-10-03 21:30:32 -05:00
10 changed files with 494 additions and 43 deletions

View File

@@ -20,6 +20,7 @@
7. CITATIONS
8. ACKNOWLEDGEMENTS
9. AUTHOR
10. DISCLOSURE
## ABOUT
@@ -56,10 +57,6 @@ a pairSEQ experiment is bipartite with integer weights, this algorithm seems ide
fairly new algorithm, and not yet implemented by the graph theory library used in this simulator (JGraphT), nor has the author had
time to implement it himself.
There have been some studies which show that [auction algorithms](https://en.wikipedia.org/wiki/Auction_algorithm) for the assignment problem, due to their simplicity, can have superior real-world performance to more complex algorithms with better theoretical asymptotic behavior. But, again, no such algorithm is implemented by JGraphT, nor has the author yet had time to implement one.
So this program instead uses the [Fibonacci heap](https://en.wikipedia.org/wiki/Fibonacci_heap) based algorithm of Fredman and Tarjan (1987) (essentially
[the Hungarian algorithm](https://en.wikipedia.org/wiki/Hungarian_algorithm) augmented with a more efficient priority queue), which has a worst-case
runtime of **O(n (n log(n) + m))**. The algorithm is implemented as described in Mehlhorn and Näher (1999). (The simulator
@@ -74,6 +71,15 @@ be balanced assignment problems, in practice sequence dropout can cause them to
the Hungarian algorithm, graph doubling--which could be challenging with the computational resources available to the
author--has not yet been necessary.
There have been some studies which show that [auction algorithms](https://en.wikipedia.org/wiki/Auction_algorithm) for the assignment problem, due to their simplicity, can have superior real-world performance to more complex algorithms with better theoretical asymptotic behavior. The author has implemented a basic forward auction algorithm, which produces an optimal assignment for unbalanced bipartite graphs with
integer weights. To allow for unbalanced assignment, this algorithm eschews epsilon-scaling,
and as a result is prone to "bidding wars" which increase run time, making it less efficient than the implementation of
the Fredman-Tarjan algorithm in JGraphT. A forward/reverse auction algorithm, as developed by Bertsekas and Castañon,
should be able to handle unbalanced (or, as they call it, asymmetric) assignment much more efficiently, but has yet to be
implemented.
The relative time/space efficiency of BiGpairSEQ when backed by different MWM algorithms remains an open question.
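The forward auction's bidding loop described above can be sketched in a minimal, self-contained form. This is an illustrative toy (plain arrays on a dense square integer matrix, not the simulator's graph classes; the class and method names are hypothetical), showing the bid of best-value margin over the runner-up plus epsilon:

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;

// Toy Gauss-Seidel forward auction for a square integer-weight assignment problem.
// weights[i][j] is the value bidder i places on item j. With epsilon = 1/(n+1),
// the total error is at most n*epsilon < 1, so the integer-weight result is optimal.
public class ToyAuction {
    public static int[] assign(int[][] weights) {
        int n = weights.length;
        double eps = 1.0 / (n + 1);
        double[] price = new double[n];
        int[] owner = new int[n];            // owner[j] = bidder owning item j, -1 if none
        Arrays.fill(owner, -1);
        Queue<Integer> unmatched = new ArrayDeque<>();
        for (int i = 0; i < n; i++) unmatched.add(i);
        while (!unmatched.isEmpty()) {
            int bidder = unmatched.poll();
            int best = -1;
            double bestVal = Double.NEGATIVE_INFINITY, runnerUp = Double.NEGATIVE_INFINITY;
            // Find the best- and second-best-valued items at current prices
            for (int j = 0; j < n; j++) {
                double v = weights[bidder][j] - price[j];
                if (v > bestVal) { runnerUp = bestVal; bestVal = v; best = j; }
                else if (v > runnerUp) { runnerUp = v; }
            }
            // Bid: raise the price by the margin over the runner-up, plus epsilon
            price[best] += bestVal - runnerUp + eps;
            if (owner[best] != -1) unmatched.add(owner[best]); // outbid owner re-enters
            owner[best] = bidder;
        }
        int[] match = new int[n]; // match[bidder] = assigned item
        for (int j = 0; j < n; j++) match[owner[j]] = j;
        return match;
    }

    public static void main(String[] args) {
        int[][] w = { {5, 8, 1}, {7, 6, 2}, {4, 3, 9} };
        System.out.println(Arrays.toString(assign(w))); // [1, 0, 2]: weight 8 + 7 + 9
    }
}
```

Because every bid raises some price by at least epsilon, an unscaled auction like this terminates, but ties in value are what drive the "bidding wars" noted above.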
## THE BiGpairSEQ ALGORITHM
@@ -614,15 +620,16 @@ the file of distinct cells may enable better simulated replication of this exper
* ~~Update graphml output to reflect current Vertex class attributes~~ DONE
* Individual well data from the SequenceRecords could be included, if there's ever a reason for it
* ~~Implement simulation of sequences being misread as other real sequence~~ DONE
* Implement redistributive heap for LEDA matching algorithm to achieve theoretical worst case of O(n(m + n log C)) where C is highest edge weight.
* Update matching metadata output options in CLI
* Add frequency distribution details to metadata output
* need to make an enum for the different distribution types and refactor the Plate class and user interfaces, also add the necessary fields to GraphWithMapData and then call it from Simulator
* Update performance data in this readme
* Add section to ReadMe describing data filtering methods.
* Re-implement CDR1 matching method
* Refactor simulator code to collect all needed data in a single scan of the plate
* Currently it scans once for the vertices and then again for the edge weights. This made simulating read depth awkward, and incompatible with caching of plate files.
* This would be a fairly major rewrite of the simulator code, but could make things faster, and would definitely make them cleaner.
* ~~Refactor simulator code to collect all needed data in a single scan of the plate~~ DONE
* ~~Currently it scans once for the vertices and then again for the edge weights. This made simulating read depth awkward, and incompatible with caching of plate files.~~
* ~~This would be a fairly major rewrite of the simulator code, but could make things faster, and would definitely make them cleaner.~~
* Implement Duan and Su's maximum weight matching algorithm
* Add controllable algorithm-type parameter?
* This would be fun and valuable, but probably take more time than I have for a hobby project.
@@ -649,8 +656,18 @@ the file of distinct cells may enable better simulated replication of this exper
* [Apache Commons CLI](https://commons.apache.org/proper/commons-cli/) -- To enable command line arguments for scripting.
## ACKNOWLEDGEMENTS
BiGpairSEQ was conceived in collaboration with Dr. Alice MacQueen, who brought the original
BiGpairSEQ was conceived in collaboration with the author's spouse, Dr. Alice MacQueen, who brought the original
pairSEQ paper to the author's attention and explained all the biology terms he didn't know.
## AUTHOR
BiGpairSEQ algorithm and simulation by Eugene Fischer, 2021. Improvements and documentation, 2022.
BiGpairSEQ algorithm and simulation by Eugene Fischer, 2021. Improvements and documentation, 2022–2023.
## DISCLOSURE
The earliest versions of the BiGpairSEQ simulator were written in 2021 to let Dr. MacQueen test hypothetical extensions
of the published pairSEQ protocol while she was interviewing for a position at Adaptive Biotechnologies. She has been
employed at Adaptive Biotechnologies since 2022.
The author has worked on this BiGpairSEQ simulator since 2021 without Dr. MacQueen's involvement, as she has had
access to related, proprietary technologies. The author has had no such access, relying exclusively on the 2015 pairSEQ
paper and other academic publications. He continues to work on BiGpairSEQ
recreationally, as it involves some very beautiful math.

View File

@@ -0,0 +1,4 @@
public enum AlgorithmType {
HUNGARIAN, //Hungarian algorithm
AUCTION, //Forward auction algorithm
}

View File

@@ -13,6 +13,7 @@ public class BiGpairSEQ {
private static boolean cacheCells = false;
private static boolean cachePlate = false;
private static boolean cacheGraph = false;
private static AlgorithmType matchingAlgoritmType = AlgorithmType.HUNGARIAN;
private static HeapType priorityQueueHeapType = HeapType.FIBONACCI;
private static boolean outputBinary = true;
private static boolean outputGraphML = false;
@@ -108,7 +109,6 @@ public class BiGpairSEQ {
return graphFilename;
}
public static boolean cacheCells() {
return cacheCells;
}
@@ -157,10 +157,16 @@ public class BiGpairSEQ {
BiGpairSEQ.cacheGraph = cacheGraph;
}
public static String getPriorityQueueHeapType() {
return priorityQueueHeapType.name();
public static HeapType getPriorityQueueHeapType() {
return priorityQueueHeapType;
}
public static AlgorithmType getMatchingAlgoritmType() { return matchingAlgoritmType; }
public static void setHungarianAlgorithm() { matchingAlgoritmType = AlgorithmType.HUNGARIAN; }
public static void setAuctionAlgorithm() { matchingAlgoritmType = AlgorithmType.AUCTION; }
public static void setPairingHeap() {
priorityQueueHeapType = HeapType.PAIRING;
}

View File

@@ -583,8 +583,9 @@ public class InteractiveInterface {
while(!backToOptions) {
System.out.println("\n---------ALGORITHM OPTIONS----------");
System.out.println("1) Use scaling algorithm by Duan and Su.");
System.out.println("2) Use LEDA book algorithm with Fibonacci heap priority queue");
System.out.println("3) Use LEDA book algorithm with pairing heap priority queue");
System.out.println("2) Use Hungarian algorithm with Fibonacci heap priority queue");
System.out.println("3) Use Hungarian algorithm with pairing heap priority queue");
System.out.println("4) Use auction algorithm");
System.out.println("0) Return to Options menu");
try {
input = sc.nextInt();
@@ -592,14 +593,18 @@ public class InteractiveInterface {
case 1 -> System.out.println("This option is not yet implemented. Choose another.");
case 2 -> {
BiGpairSEQ.setFibonacciHeap();
System.out.println("MWM algorithm set to LEDA with Fibonacci heap");
System.out.println("MWM algorithm set to Hungarian with Fibonacci heap");
backToOptions = true;
}
case 3 -> {
BiGpairSEQ.setPairingHeap();
System.out.println("MWM algorithm set to LEDA with pairing heap");
System.out.println("MWM algorithm set to Hungarian with pairing heap");
backToOptions = true;
}
case 4 -> {
BiGpairSEQ.setAuctionAlgorithm();
System.out.println("MWM algorithm set to auction");
}
case 0 -> backToOptions = true;
default -> System.out.println("Invalid input");
}

View File

@@ -0,0 +1,176 @@
import org.jgrapht.Graph;
import org.jgrapht.GraphTests;
import org.jgrapht.alg.interfaces.MatchingAlgorithm;
import java.math.BigDecimal;
import java.util.*;
/**
* Maximum weight matching in bipartite graphs with strictly integer edge weights, using a forward auction algorithm.
* This implementation uses the Gauss-Seidel version of the forward auction algorithm, in which bids are submitted
* one at a time. For any weighted bipartite graph with n vertices in the smaller partition, this algorithm will produce
* a matching that is within n*epsilon of being optimal. Using an epsilon = 1/(n+1) ensures that this matching differs
* from an optimal matching by <1. Thus, for a bipartite graph with strictly integer weights, this algorithm returns
* a maximum weight matching.
*
* See:
* "Towards auction algorithms for large dense assignment problems"
* Libor Buš and Pavel Tvrdík, Comput Optim Appl (2009) 43:411-436
* https://link.springer.com/article/10.1007/s10589-007-9146-5
*
* See also:
* Many books and papers by Dimitri Bertsekas, including chapter 4 of Linear Network Optimization:
* https://web.mit.edu/dimitrib/www/LNets_Full_Book.pdf
*
* @param <V> the graph vertex type
* @param <E> the graph edge type
*
* @author Eugene Fischer
*/
public class MaximumIntegerWeightBipartiteAuctionMatching<V, E> implements MatchingAlgorithm<V, E> {
private final Graph<V, E> graph;
private final Set<V> partition1;
private final Set<V> partition2;
private final BigDecimal epsilon;
private final Set<E> matching;
private BigDecimal matchingWeight;
private boolean swappedPartitions = false;
public MaximumIntegerWeightBipartiteAuctionMatching(Graph<V, E> graph, Set<V> partition1, Set<V> partition2) {
this.graph = GraphTests.requireUndirected(graph);
this.partition1 = Objects.requireNonNull(partition1, "Partition 1 cannot be null");
this.partition2 = Objects.requireNonNull(partition2, "Partition 2 cannot be null");
int n = Math.max(partition1.size(), partition2.size());
this.epsilon = BigDecimal.valueOf(1 / ((double) n + 1)); //The minimum price increase of a bid
this.matching = new LinkedHashSet<>();
this.matchingWeight = BigDecimal.ZERO;
}
/*
Method coded using MaximumWeightBipartiteMatching.class from JGraphT as a model
*/
@Override
public Matching<V, E> getMatching() {
/*
* Test input instance
*/
if (!GraphTests.isSimple(graph)) {
throw new IllegalArgumentException("Only simple graphs supported");
}
if (!GraphTests.isBipartitePartition(graph, partition1, partition2)) {
throw new IllegalArgumentException("Graph partition is not bipartite");
}
/*
If the two partitions are different sizes, the bidders must be the smaller of the two partitions.
*/
Set<V> items;
Set<V> bidders;
if (partition2.size() >= partition1.size()) {
bidders = partition1;
items = partition2;
}
else {
bidders = partition2;
items = partition1;
swappedPartitions = true;
}
/*
Create a map to track the owner of each item, which is initially null,
and a map to track the price of each item, which is initially 0. An
initial price of 0 allows for asymmetric assignment (though it does mean
that this form of the algorithm cannot take advantage of epsilon-scaling).
*/
Map<V, V> owners = new HashMap<>();
Map<V, BigDecimal> prices = new HashMap<>();
for(V item: items) {
owners.put(item, null);
prices.put(item, BigDecimal.ZERO);
}
//Create a queue of bidders that don't currently own an item, which is initially all of them
Queue<V> unmatchedBidders = new ArrayDeque<>();
for(V bidder: bidders) {
unmatchedBidders.offer(bidder);
}
//Run the auction while there are remaining unmatched bidders
while (unmatchedBidders.size() > 0) {
V bidder = unmatchedBidders.poll();
V item = null;
BigDecimal bestValue = BigDecimal.valueOf(-1.0);
BigDecimal runnerUpValue = BigDecimal.valueOf(-1.0);
/*
Find the items that offer the best and second-best value for the bidder,
then submit a bid equal to the price of the best-valued item plus the marginal value over
the second-best-valued item plus epsilon.
*/
for (E edge: graph.edgesOf(bidder)) {
double weight = graph.getEdgeWeight(edge);
if(weight == 0.0) {
continue;
}
V tmp = getItem(edge);
BigDecimal value = BigDecimal.valueOf(weight).subtract(prices.get(tmp));
if (value.compareTo(bestValue) >= 0) {
runnerUpValue = bestValue;
bestValue = value;
item = tmp;
}
else if (value.compareTo(runnerUpValue) >= 0) {
runnerUpValue = value;
}
}
if(bestValue.compareTo(BigDecimal.ZERO) >= 0) {
V formerOwner = owners.get(item);
BigDecimal price = prices.get(item);
BigDecimal bid = price.add(bestValue).subtract(runnerUpValue).add(epsilon);
if (formerOwner != null) {
unmatchedBidders.offer(formerOwner);
}
owners.put(item, bidder);
prices.put(item, bid);
}
}
//Add all edges between items and their owners to the matching
for (V item: owners.keySet()) {
if (owners.get(item) != null) {
matching.add(graph.getEdge(item, owners.get(item)));
}
}
//Sum the edges of the matching to obtain the matching weight
for(E edge: matching) {
this.matchingWeight = this.matchingWeight.add(BigDecimal.valueOf(graph.getEdgeWeight(edge)));
}
return new MatchingImpl<>(graph, matching, matchingWeight.doubleValue());
}
private V getItem(E edge) {
if (swappedPartitions) {
return graph.getEdgeSource(edge);
}
else {
return graph.getEdgeTarget(edge);
}
}
// //method for implementing a forward-reverse auction algorithm, not used here
// private V getBidder(E edge) {
// if (swappedPartitions) {
// return graph.getEdgeTarget(edge);
// }
// else {
// return graph.getEdgeSource(edge);
// }
// }
public BigDecimal getMatchingWeight() {
return matchingWeight;
}
}
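The javadoc's claim (a matching within n*epsilon of optimal, hence exact for integer weights when epsilon = 1/(n+1)) can be checked empirically. This standalone sketch compares a toy array-based forward auction against a brute-force search over all assignments; the class and helper names are illustrative, and it does not use the simulator's JGraphT types:

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;
import java.util.Random;

public class AuctionOptimalityCheck {

    // Unscaled Gauss-Seidel forward auction on a dense square integer matrix;
    // returns the total weight of the resulting assignment.
    static int auctionWeight(int[][] w) {
        int n = w.length;
        double eps = 1.0 / (n + 1);
        double[] price = new double[n];
        int[] owner = new int[n];
        Arrays.fill(owner, -1);
        Queue<Integer> unmatched = new ArrayDeque<>();
        for (int i = 0; i < n; i++) unmatched.add(i);
        while (!unmatched.isEmpty()) {
            int b = unmatched.poll();
            int best = -1;
            double bestVal = Double.NEGATIVE_INFINITY, second = Double.NEGATIVE_INFINITY;
            for (int j = 0; j < n; j++) {
                double v = w[b][j] - price[j];
                if (v > bestVal) { second = bestVal; bestVal = v; best = j; }
                else if (v > second) { second = v; }
            }
            price[best] += bestVal - second + eps;
            if (owner[best] != -1) unmatched.add(owner[best]);
            owner[best] = b;
        }
        int total = 0;
        for (int j = 0; j < n; j++) total += w[owner[j]][j];
        return total;
    }

    // Brute-force optimum over all n! assignments, by recursive enumeration.
    static int bruteForce(int[][] w) {
        return permute(w, new boolean[w.length], 0);
    }

    private static int permute(int[][] w, boolean[] used, int bidder) {
        if (bidder == w.length) return 0;
        int best = Integer.MIN_VALUE;
        for (int j = 0; j < w.length; j++) {
            if (!used[j]) {
                used[j] = true;
                best = Math.max(best, w[bidder][j] + permute(w, used, bidder + 1));
                used[j] = false;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int trial = 0; trial < 100; trial++) {
            int n = 2 + rng.nextInt(3); // 2..4 bidders and items
            int[][] w = new int[n][n];
            for (int[] row : w) for (int j = 0; j < n; j++) row[j] = rng.nextInt(10);
            if (auctionWeight(w) != bruteForce(w)) throw new AssertionError("mismatch");
        }
        System.out.println("auction matched brute force on all trials");
    }
}
```

The brute force is exponential, so this only scales to tiny instances; it is a correctness check, not a benchmark.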

View File

@@ -0,0 +1,212 @@
import org.jgrapht.Graph;
import org.jgrapht.GraphTests;
import org.jgrapht.alg.interfaces.MatchingAlgorithm;
import org.jgrapht.alg.util.Pair;
import java.math.BigDecimal;
import java.util.*;
/*
Maximum weight matching in bipartite graphs with strictly integer edge weights, found using the
unscaled look-back auction algorithm
*/
public class MaximumWeightBipartiteLookBackAuctionMatching<V, E> implements MatchingAlgorithm<V, E> {
private final Graph<V, E> graph;
private final Set<V> partition1;
private final Set<V> partition2;
private final BigDecimal delta;
private final Set<E> matching;
private BigDecimal matchingWeight;
private boolean swappedPartitions = false;
public MaximumWeightBipartiteLookBackAuctionMatching(Graph<V, E> graph, Set<V> partition1, Set<V> partition2) {
this.graph = GraphTests.requireUndirected(graph);
this.partition1 = Objects.requireNonNull(partition1, "Partition 1 cannot be null");
this.partition2 = Objects.requireNonNull(partition2, "Partition 2 cannot be null");
int n = Math.max(partition1.size(), partition2.size());
this.delta = BigDecimal.valueOf(1 / ((double) n + 1));
this.matching = new LinkedHashSet<>();
this.matchingWeight = BigDecimal.ZERO;
}
/*
Method coded using MaximumWeightBipartiteMatching.class from JGraphT as a model
*/
@Override
public Matching<V, E> getMatching() {
/*
* Test input instance
*/
if (!GraphTests.isSimple(graph)) {
throw new IllegalArgumentException("Only simple graphs supported");
}
if (!GraphTests.isBipartitePartition(graph, partition1, partition2)) {
throw new IllegalArgumentException("Graph partition is not bipartite");
}
/*
If the two partitions are different sizes, the bidders must be the smaller of the two partitions.
*/
Set<V> items;
Set<V> bidders;
if (partition2.size() >= partition1.size()) {
bidders = partition1;
items = partition2;
}
else {
bidders = partition2;
items = partition1;
swappedPartitions = true;
}
/*
Create a map to track the owner of each item, which is initially null
*/
Map<V, V> owners = new HashMap<>();
/*
Create a map to track the prices of the objects
*/
Map<V, BigDecimal> prices = new HashMap<>();
for(V item: items) {
owners.put(item, null);
prices.put(item, BigDecimal.ZERO);
}
/*
Create a map to track the most valuable object for a bidder
*/
Map<V, V> mostValuableItems = new HashMap<>();
/*
Create a map to track the second most valuable object for a bidder
*/
Map<V, V> runnerUpItems = new HashMap<>();
/*
Create a map to track the bidder value thresholds
*/
Map<V, BigDecimal> valueThresholds = new HashMap<>();
//Initialize queue of all bidders that don't currently own an item
Queue<V> unmatchedBidders = new ArrayDeque<>();
for(V bidder: bidders) {
unmatchedBidders.offer(bidder);
valueThresholds.put(bidder, BigDecimal.ZERO);
mostValuableItems.put(bidder, null);
runnerUpItems.put(bidder, null);
}
while (unmatchedBidders.size() > 0) {
V bidder = unmatchedBidders.poll();
// BigDecimal valueThreshold = valueThresholds.get(bidder);
BigDecimal bestValue = BigDecimal.ZERO;
BigDecimal runnerUpValue = BigDecimal.ZERO;
boolean reinitialize = true;
// if (mostValuableItems.get(bidder) != null && runnerUpItems.get(bidder) != null) {
// reinitialize = false;
// //get the weight of the edge between the bidder and the best valued item
// V bestItem = mostValuableItems.get(bidder);
// BigDecimal bestItemWeight = BigDecimal.valueOf(graph.getEdgeWeight(graph.getEdge(bidder, bestItem)));
// bestValue = bestItemWeight.subtract(prices.get(bestItem));
// V runnerUpItem = runnerUpItems.get(bidder);
// BigDecimal runnerUpWeight = BigDecimal.valueOf(graph.getEdgeWeight(graph.getEdge(bidder, runnerUpItem)));
// runnerUpValue = runnerUpWeight.subtract(prices.get(runnerUpItem));
// //if both values are still above the threshold
// if (bestValue.compareTo(valueThreshold) >= 0 && runnerUpValue.compareTo(valueThreshold) >= 0) {
// if (bestValue.compareTo(runnerUpValue) < 0) { //if best value is lower than runner up
// BigDecimal tmp = bestValue;
// bestValue = runnerUpValue;
// runnerUpValue = tmp;
// mostValuableItems.put(bidder, runnerUpItem);
// runnerUpItems.put(bidder, bestItem);
// }
// BigDecimal newValueThreshold = bestValue.min(runnerUpValue);
// valueThresholds.put(bidder, newValueThreshold);
// System.out.println("lookback successful");
// }
// else {
// reinitialize = true; //lookback failed
// }
// }
if (reinitialize){
bestValue = BigDecimal.ZERO;
runnerUpValue = BigDecimal.ZERO;
for (E edge: graph.edgesOf(bidder)) {
double weight = graph.getEdgeWeight(edge);
if (weight == 0.0) {
continue;
}
V tmpItem = getItem(bidder, edge);
BigDecimal tmpValue = BigDecimal.valueOf(weight).subtract(prices.get(tmpItem));
if (tmpValue.compareTo(bestValue) >= 0) {
runnerUpValue = bestValue;
bestValue = tmpValue;
runnerUpItems.put(bidder, mostValuableItems.get(bidder));
mostValuableItems.put(bidder, tmpItem);
}
else if (tmpValue.compareTo(runnerUpValue) >= 0) {
runnerUpValue = tmpValue;
runnerUpItems.put(bidder, tmpItem);
}
}
valueThresholds.put(bidder, runnerUpValue);
}
//Should now have initialized the maps to make look back possible
//skip this bidder if the best value is still zero (compareTo rather than equals, since BigDecimal.equals is scale-sensitive)
if (bestValue.compareTo(BigDecimal.ZERO) == 0) {
continue;
}
V mostValuableItem = mostValuableItems.get(bidder);
BigDecimal price = prices.get(mostValuableItem);
BigDecimal bid = price.add(bestValue).subtract(runnerUpValue).add(this.delta);
V formerOwner = owners.get(mostValuableItem);
if (formerOwner != null) {
unmatchedBidders.offer(formerOwner);
}
owners.put(mostValuableItem, bidder);
prices.put(mostValuableItem, bid);
}
for (V item: owners.keySet()) {
if (owners.get(item) != null) {
matching.add(graph.getEdge(item, owners.get(item)));
}
}
for(E edge: matching) {
this.matchingWeight = this.matchingWeight.add(BigDecimal.valueOf(graph.getEdgeWeight(edge)));
}
return new MatchingImpl<>(graph, matching, matchingWeight.doubleValue());
}
private V getItem(V bidder, E edge) {
if (swappedPartitions) {
return graph.getEdgeSource(edge);
}
else {
return graph.getEdgeTarget(edge);
}
}
private V getBidder(V item, E edge) {
if (swappedPartitions) {
return graph.getEdgeTarget(edge);
}
else {
return graph.getEdgeSource(edge);
}
}
public BigDecimal getMatchingWeight() {
return matchingWeight;
}
}

View File

@@ -61,12 +61,24 @@ public class Plate {
this.wells = wells;
this.size = wells.size();
double totalCellCount = 0.0;
double totalDropoutCount = 0.0;
List<Integer> concentrations = new ArrayList<>();
for (List<String[]> w: wells) {
if(!concentrations.contains(w.size())){
concentrations.add(w.size());
}
for (String[] cell: w) {
totalCellCount += 1.0;
for (String sequence: cell) {
if("-1".equals(sequence)) {
totalDropoutCount += 1.0;
}
}
}
}
double totalSequenceCount = totalCellCount * 4;
this.error = totalDropoutCount / totalSequenceCount;
this.populations = new Integer[concentrations.size()];
for (int i = 0; i < this.populations.length; i++) {
this.populations[i] = concentrations.get(i);

View File

@@ -93,8 +93,8 @@ public class PlateFileWriter {
printer.printComment("Cell source file name: " + sourceFileName);
printer.printComment("Each row represents one well on the plate.");
printer.printComment("Plate size: " + size);
printer.printComment("Error rate: " + error);
printer.printComment("Well populations: " + wellPopulationsString);
printer.printComment("Error rate: " + error);
if(isExponential){
printer.printComment("Lambda: " + lambda);
}

View File

@@ -183,32 +183,33 @@ public class Simulator implements GraphModificationFunctions {
"removed");}
//Find Maximum Weight Matching
//using jheaps library class PairingHeap for improved efficiency
if(verbose){System.out.println("Finding maximum weight matching");}
MaximumWeightBipartiteMatching maxWeightMatching;
//Use correct heap type for priority queue
String heapType = BiGpairSEQ.getPriorityQueueHeapType();
switch (heapType) {
case "PAIRING" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
alphas,
betas,
i -> new PairingHeap(Comparator.naturalOrder()));
//The matching object
MatchingAlgorithm<Vertex, DefaultWeightedEdge> maxWeightMatching;
//Determine algorithm type
AlgorithmType algorithm = BiGpairSEQ.getMatchingAlgoritmType();
switch (algorithm) { //Only two options now, but I have room to add more algorithms in the future this way
case AUCTION -> {
//create a new MaximumIntegerWeightBipartiteAuctionMatching
maxWeightMatching = new MaximumIntegerWeightBipartiteAuctionMatching<>(graph, alphas, betas);
}
case "FIBONACCI" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
alphas,
betas,
i -> new FibonacciHeap(Comparator.naturalOrder()));
}
default -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
alphas,
betas);
default -> { //HUNGARIAN
//use selected heap type for priority queue
HeapType heap = BiGpairSEQ.getPriorityQueueHeapType();
if(HeapType.PAIRING.equals(heap)) {
maxWeightMatching = new MaximumWeightBipartiteMatching<Vertex, DefaultWeightedEdge>(graph,
alphas,
betas,
i -> new PairingHeap(Comparator.naturalOrder()));
}
else {//Fibonacci is the default, and what's used in the JGraphT implementation
maxWeightMatching = new MaximumWeightBipartiteMatching<Vertex, DefaultWeightedEdge>(graph,
alphas,
betas);
}
}
}
//get the matching
MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = maxWeightMatching.getMatching();
MatchingAlgorithm.Matching<Vertex, DefaultWeightedEdge> matching = maxWeightMatching.getMatching();
if(verbose){System.out.println("Matching completed");}
Instant stop = Instant.now();
@@ -226,7 +227,7 @@ public class Simulator implements GraphModificationFunctions {
List<List<String>> allResults = new ArrayList<>();
NumberFormat nf = NumberFormat.getInstance(Locale.US);
MathContext mc = new MathContext(3);
Iterator<DefaultWeightedEdge> weightIter = graphMatching.iterator();
Iterator<DefaultWeightedEdge> weightIter = matching.iterator();
DefaultWeightedEdge e;
int trueCount = 0;
int falseCount = 0;
@@ -267,10 +268,19 @@ public class Simulator implements GraphModificationFunctions {
}
//Metadata comments for CSV file
String algoType = "LEDA book with heap: " + heapType;
String algoType;
switch(algorithm) {
case AUCTION -> {
algoType = "Auction algorithm";
}
default -> { //HUNGARIAN
algoType = "Hungarian algorithm with heap: " + BiGpairSEQ.getPriorityQueueHeapType().name();
}
}
int min = Math.min(graphAlphaCount, graphBetaCount);
//matching weight
BigDecimal totalMatchingWeight = maxWeightMatching.getMatchingWeight();
Double matchingWeight = matching.getWeight();
//rate of attempted matching
double attemptRate = (double) (trueCount + falseCount) / min;
BigDecimal attemptRateTrunc = new BigDecimal(attemptRate, mc);
@@ -309,7 +319,7 @@ public class Simulator implements GraphModificationFunctions {
metadata.put("sequence dropout rate", data.getDropoutRate().toString());
metadata.put("graph filename", dataFilename);
metadata.put("MWM algorithm type", algoType);
metadata.put("matching weight", totalMatchingWeight.toString());
metadata.put("matching weight", matchingWeight.toString());
metadata.put("well populations", wellPopulationsString);
metadata.put("sequence read depth", data.getReadDepth().toString());
metadata.put("sequence read error rate", data.getReadErrorRate().toString());
@@ -347,6 +357,7 @@ public class Simulator implements GraphModificationFunctions {
return output;
}
//Commented out CDR1 matching until it's time to re-implement it
// //Simulated matching of CDR1s to CDR3s. Requires MatchingResult from prior run of matchCDR3s.
// public static MatchingResult[] matchCDR1s(List<Integer[]> distinctCells,

View File

@@ -74,4 +74,12 @@ public class Vertex implements Serializable, Comparable<Vertex> {
public int compareTo(Vertex other) {
return this.vertexLabel - other.getVertexLabel();
}
public Double getPotential() {
return potential;
}
public void setPotential(Double potential) {
this.potential = potential;
}
}