40 Commits

Author SHA1 Message Date
dad0fd35fd Update readme to reflect wells with random population implemented 2022-02-24 15:47:08 -06:00
35d580cfcf Update readme to reflect wells with random population implemented 2022-02-24 15:45:03 -06:00
ab8d98ed81 Update readme to reflect new default caching behavior. 2022-02-24 15:39:15 -06:00
3d9890e16a Change GraphModificationFunctions to only save edges if graph data is cached 2022-02-24 15:32:27 -06:00
dd64ac2731 Change GraphModificationFunctions to interface 2022-02-24 15:18:09 -06:00
a5238624f1 Change default graph caching behavior to false 2022-02-24 15:14:28 -06:00
d8ba42b801 Fix Algorithm Options menu output 2022-02-24 14:59:08 -06:00
8edd89d784 Added heap type selection, fixed error handling 2022-02-24 14:48:19 -06:00
2829b88689 Update readme to reflect caching changes 2022-02-24 12:47:26 -06:00
108b0ec13f Improve options menu wording 2022-02-24 12:42:09 -06:00
a8b58d3f79 Output new setting when changing options 2022-02-24 12:38:15 -06:00
bf64d57731 implement option menu for file caching 2022-02-24 12:30:47 -06:00
c068c3db3c implement option menu for file caching 2022-02-23 20:35:31 -06:00
4bcda9b66c update readme 2022-02-23 13:22:04 -06:00
17ae763c6c Generate populations correctly 2022-02-23 10:37:40 -06:00
decdb147a9 Cache everything 2022-02-23 10:30:42 -06:00
74ffbfd8ac make everything use same random number generator 2022-02-23 09:29:21 -06:00
08699ce8ce Change output order to match interactive UI 2022-02-23 08:56:09 -06:00
69b0cc535c Error checking 2022-02-23 08:55:07 -06:00
e58f7b0a55 checking for possible divide by zero error. 2022-02-23 08:54:14 -06:00
dd2164c250 implement sample plates with random well populations 2022-02-23 08:14:17 -06:00
7323093bdc change "getRandomNumber" to "getRandomInt" for consistency. 2022-02-23 08:13:52 -06:00
f904cf6672 add more data caching code 2022-02-23 08:13:06 -06:00
3ccee9891b change "concentrations" to "populations" for consistency 2022-02-23 08:12:48 -06:00
40c2be1cfb create populations string correctly 2022-02-23 08:11:01 -06:00
4b597c4e5e remove old testing code 2022-02-23 08:10:35 -06:00
b2398531a3 Update readme 2022-02-23 05:11:36 +00:00
8e9a250890 Cache graph data on creation 2022-02-22 22:23:55 -06:00
e2a996c997 update readme 2022-02-22 22:23:40 -06:00
a5db89cb0b update readme 2022-02-22 22:13:01 -06:00
1630f9ccba Moved I/O alert to file reader 2022-02-22 22:11:50 -06:00
d785aa0da2 Moved I/O alert to file reader 2022-02-22 22:10:31 -06:00
a7afeb6119 bugfixes 2022-02-22 22:10:09 -06:00
f8167b0774 Add .jar manifest to repo 2022-02-22 21:45:46 -06:00
68ee9e4bb6 Implemented storing graphs in memory for multiple pairing experiments 2022-02-22 21:30:00 -06:00
fd2ec76b71 Realized how to store graph in memory 2022-02-22 19:42:35 -06:00
875f457a2d reimplement CLI (in progress) 2022-02-22 19:42:23 -06:00
906c06062f Added metadata to MatchingResult to enable CLI options 2022-02-22 18:36:30 -06:00
90ae2ff474 Re-implemeting CLI options (in progress) 2022-02-22 17:37:00 -06:00
7d983076f3 Add link to releases page for download 2022-02-22 16:34:24 -06:00
18 changed files with 1783 additions and 1205 deletions

View File: README.md

@@ -12,7 +12,7 @@ Unlike pairSEQ, which calculates p-values for every TCR alpha/beta overlap and c
against a null distribution, BiGpairSEQ does not do any statistical calculations
directly.
BiGpairSEQ creates a [weighted bipartite graph](https://en.wikipedia.org/wiki/Bipartite_graph) representing the sample plate.
The distinct TCRA and TCRB sequences form the two sets of vertices. Every TCRA/TCRB pair that shares a well
is connected by an edge, with the edge weight set to the number of wells in which both sequences appear.
(Sequences present in *all* wells are filtered out prior to creating the graph, as there is no signal in their occupancy pattern.)
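For illustration, an occupancy graph of this shape can be put together with JGraphT (the graph library this project already uses); this is a made-up minimal sketch with hypothetical vertex IDs, not the program's actual construction code:

```java
import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;

public class OccupancyGraphSketch {
    public static void main(String[] args) {
        // One vertex per distinct TCRA and per distinct TCRB sequence (hypothetical integer IDs).
        SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph =
                new SimpleWeightedGraph<>(DefaultWeightedEdge.class);
        int alpha = 1;        // a TCRA sequence
        int beta = 1001;      // a TCRB sequence
        int sharedWells = 7;  // wells in which both sequences appear

        graph.addVertex(alpha);
        graph.addVertex(beta);
        DefaultWeightedEdge e = graph.addEdge(alpha, beta);
        graph.setEdgeWeight(e, sharedWells); // edge weight = size of the occupancy overlap
    }
}
```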
@@ -29,15 +29,13 @@ Unfortunately, it's a fairly new algorithm, and not yet implemented by the graph
So this program instead uses the Fibonacci heap-based algorithm of Fredman and Tarjan (1987), which has a worst-case
runtime of **O(n (n log(n) + m))**. The algorithm is implemented as described in Mehlhorn and Näher (1999).
The priority queue used by the matching algorithm can be backed by either a pairing heap or a Fibonacci heap;
the heap type is selectable in the program's Options, and the Fibonacci heap (which performs better on large graphs)
is the default.
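A matching on such a graph could then be computed along these lines with JGraphT's `MaximumWeightBipartiteMatching`; again an illustrative sketch with hypothetical partition sets, not the project's own matching or heap-selection code:

```java
import java.util.Set;
import org.jgrapht.alg.interfaces.MatchingAlgorithm;
import org.jgrapht.alg.matching.MaximumWeightBipartiteMatching;
import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;

public class MatchingSketch {
    // alphaVertices / betaVertices are the two partitions of the occupancy graph (hypothetical).
    static MatchingAlgorithm.Matching<Integer, DefaultWeightedEdge> pairSequences(
            SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
            Set<Integer> alphaVertices, Set<Integer> betaVertices) {
        // The program's heap-type option (pairing vs. Fibonacci) would be wired into the matcher's
        // priority queue; the plain constructor is shown here for simplicity.
        MaximumWeightBipartiteMatching<Integer, DefaultWeightedEdge> matcher =
                new MaximumWeightBipartiteMatching<>(graph, alphaVertices, betaVertices);
        return matcher.getMatching(); // each matched edge proposes one TCRA/TCRB pairing
    }
}
```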
## USAGE
### RUNNING THE PROGRAM
[Download the current version of BiGpairSEQ_Sim.](https://gitea.ejsf.synology.me/efischer/BiGpairSEQ/releases)
BiGpairSEQ_Sim is an executable .jar file. Requires Java 14 or higher. [OpenJDK 17](https://jdk.java.net/17/)
recommended.
Run with the command:
@@ -63,20 +61,34 @@ Please select an option:
2) Generate a sample plate of T cells
3) Generate CDR3 alpha/beta occupancy data and overlap graph
4) Simulate bipartite graph CDR3 alpha/beta matching (BiGpairSEQ)
8) Options
9) About/Acknowledgments
0) Exit
```
### INPUT/OUTPUT
To run the simulation, the program reads and writes 4 kinds of files:
* Cell Sample files in CSV format
* Sample Plate files in CSV format
* Graph/Data files in binary object serialization format
* Matching Results files in CSV format
These files are often generated in sequence. When entering filenames, it is not necessary to include the file extension
(.csv or .ser). When reading or writing files, the program will automatically add the correct extension to any filename without one.
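A condensed sketch of that extension handling (the same kind of regex check appears in `CellFileReader` later in this diff); the helper name here is made up:

```java
public class FilenameSketch {
    // Hypothetical helper: append the expected extension when the user omits it (ext is ".csv" or ".ser").
    static String withExtension(String name, String ext) {
        return name.matches(".*\\" + ext) ? name : name + ext;
    }

    public static void main(String[] args) {
        System.out.println(withExtension("myPlate", ".csv"));     // myPlate.csv
        System.out.println(withExtension("myPlate.csv", ".csv")); // myPlate.csv
    }
}
```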
To save file I/O time, the most recent instance of each of these four
files either generated or read from disk can be cached in program memory. This can be important for Graph/Data files,
which can be several gigabytes in size. Since some simulations may require running multiple,
differently-configured BiGpairSEQ matchings on the same graph, keeping the most recent graph cached may reduce execution time.
(The manipulation necessary to re-use a graph incurs its own performance overhead, though, which may scale with graph
size faster than file I/O does. If so, caching is best for smaller graphs.)
When caching is active, subsequent uses of the same data file won't need to be read in again until another file of that type is used or generated,
or caching is turned off for that file type. The program checks whether it needs to update its cached data by comparing
filenames as entered by the user. On encountering a new filename, the program flushes its cache and reads in the new file.
The program's caching behavior can be controlled in the Options menu. By default, all caching is OFF.
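The filename comparison works along the lines of the pattern used in `InteractiveInterface` further down in this changeset; a condensed sketch, with a made-up `loadGraph` wrapper:

```java
public class GraphCacheSketch {
    // Hypothetical wrapper around the cache check performed before a matching run.
    static GraphWithMapData loadGraph(String graphFilename) {
        // Same filename as the cached copy: skip file I/O and reuse the in-memory graph.
        if (graphFilename.equals(BiGpairSEQ.getGraphFilename())) {
            return BiGpairSEQ.getGraphInMemory();
        }
        // New filename: read from disk and, if caching is on, replace the cached graph.
        GraphWithMapData data = new GraphDataObjectReader(graphFilename).getData();
        if (BiGpairSEQ.cacheGraph()) {
            BiGpairSEQ.setGraphInMemory(data, graphFilename);
        }
        return data;
    }
}
```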
#### Cell Sample Files
Cell Sample files consist of any number of distinct "T cells." Every cell contains
@@ -119,15 +131,18 @@ Options when making a Sample Plate file:
* Standard deviation size
* Exponential
* Lambda value
* *(Based on the slope of the graph in Figure 4C of the pairSEQ paper, the distribution of the original experiment was approximately exponential with a lambda ~0.6. (Howie, et al. 2015))*
* Total number of wells on the plate
* Well populations random or fixed
* If random, minimum and maximum population sizes
* If fixed
* Number of sections on plate
* Number of T cells per well
* per section, if more than one section
* Dropout rate
Files are in CSV format. There are no header labels. Every row represents a well.
Every value represents an individual cell, containing four sequences, depicted as an array string:
`[CDR3A, CDR3B, CDR1A, CDR1B]`. So a representative cell might look like this:
`[525902, 791533, -1, 866282]`
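For reference, one such value could be parsed back into its four integer fields as sketched below (illustrative only; the project's own file readers are not shown in this hunk):

```java
import java.util.Arrays;

public class CellValueSketch {
    // Hypothetical parser for one plate entry such as "[525902, 791533, -1, 866282]".
    static Integer[] parseCell(String value) {
        String[] parts = value.replaceAll("[\\[\\]\\s]", "").split(",");
        return Arrays.stream(parts).map(Integer::valueOf).toArray(Integer[]::new);
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(parseCell("[525902, 791533, -1, 866282]")));
    }
}
```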
@@ -153,14 +168,16 @@ Structure:
---
#### Graph/Data Files
Graph/Data files are serialized binaries of a Java object containing the weighted bipartite graph representation of a
Sample Plate, along with the necessary metadata for matching and results output. Making them requires a Cell Sample file
(to construct a list of correct sequence pairs for checking the accuracy of BiGpairSEQ simulations) and a
Sample Plate file (to construct the associated occupancy graph).
These files can be several gigabytes in size. Writing them to a file lets us generate a graph and its metadata once,
then use it for multiple different BiGpairSEQ simulations.
Options for creating a Graph/Data file:
* The Cell Sample file to use
* The Sample Plate file to use. (This must have been generated from the selected Cell Sample file.)
@@ -170,8 +187,8 @@ portable data format may be implemented in the future. The tricky part is encodi
---
#### Matching Results Files
Matching results files consist of the results of a BiGpairSEQ matching simulation. Making them requires a Graph and
Data file. Matching results files are in CSV format. Rows are sequence pairings with extra relevant data. Columns are pairing-specific details.
Metadata about the matching simulation is included as comments. Comments are preceded by `#`.
Options when running a BiGpairSEQ simulation of CDR3 alpha/beta matching:
@@ -237,14 +254,16 @@ slightly less time than the simulation itself. Real elapsed time from start to f
## TODO
* ~~Try invoking GC at end of workloads to reduce paging to disk~~ DONE
* Hold graph data in memory until another graph is read-in? ~~ABANDONED~~ ~~UNABANDONED~~ DONE
* ~~*No, this won't work, because BiGpairSEQ simulations alter the underlying graph based on filtering constraints. Changes would cascade with multiple experiments.*~~
* Might have figured out a way to do it, by taking edges out and then putting them back into the graph. This may actually be possible.
* It is possible, though the modifications to the graph incur their own performance penalties. Need testing to see which option is best.
* See if there's a reasonable way to reformat Sample Plate files so that wells are columns instead of rows.
* ~~Problem is variable number of cells in a well~~
* ~~Apache Commons CSV library writes entries a row at a time~~
* _Got this working, but at the cost of a profoundly strange bug in graph occupancy filtering. Have reverted the repo until I can figure out what caused that. Given how easily the tidyverse transposes CSV matrices in R, might not even be worth fixing._
* Re-implement command line arguments, to enable scripting and statistical simulation studies
* ~~Implement sample plates with random numbers of T cells per well.~~ DONE
* Possible BiGpairSEQ advantage over pairSEQ: BiGpairSEQ is resilient to variations in well population sizes on a sample plate; pairSEQ is not.
* preliminary data suggests that BiGpairSEQ behaves roughly as though the whole plate had whatever the *average* well concentration is, but that's still speculative.
* Enable GraphML output in addition to serialized object binaries, for data portability
@@ -252,9 +271,10 @@ slightly less time than the simulation itself. Real elapsed time from start to f
* Re-implement CDR1 matching method
* Implement Duan and Su's maximum weight matching algorithm
* Add controllable algorithm-type parameter?
* ~~Test whether pairing heap (currently used) or Fibonacci heap is more efficient for priority queue in current matching algorithm~~ DONE
* ~~in theory Fibonacci heap should be more efficient, but complexity overhead may eliminate theoretical advantage~~
* ~~Add controllable heap-type parameter?~~
* Parameter implemented. For large graphs, Fibonacci heap wins. Now the new default.

View File: BiGpairSEQ.java

@@ -0,0 +1,167 @@
import java.util.Random;
//main class. For choosing interface type and caching file data
public class BiGpairSEQ {
private static final Random rand = new Random();
private static CellSample cellSampleInMemory = null;
private static String cellFilename = null;
private static Plate plateInMemory = null;
private static String plateFilename = null;
private static GraphWithMapData graphInMemory = null;
private static String graphFilename = null;
private static boolean cacheCells = false;
private static boolean cachePlate = false;
private static boolean cacheGraph = false;
private static String priorityQueueHeapType = "FIBONACCI";
public static void main(String[] args) {
if (args.length == 0) {
InteractiveInterface.startInteractive();
}
else {
//This will be uncommented when command line arguments are re-implemented.
//CommandLineInterface.startCLI(args);
System.out.println("Command line arguments are still being re-implemented.");
}
}
public static Random getRand() {
return rand;
}
public static CellSample getCellSampleInMemory() {
return cellSampleInMemory;
}
public static void setCellSampleInMemory(CellSample cellSample, String filename) {
if(cellSampleInMemory != null) {
clearCellSampleInMemory();
}
cellSampleInMemory = cellSample;
cellFilename = filename;
System.out.println("Cell sample file " + filename + " cached.");
}
public static void clearCellSampleInMemory() {
cellSampleInMemory = null;
cellFilename = null;
System.gc();
System.out.println("Cell sample file cache cleared.");
}
public static String getCellFilename() {
return cellFilename;
}
public static Plate getPlateInMemory() {
return plateInMemory;
}
public static void setPlateInMemory(Plate plate, String filename) {
if(plateInMemory != null) {
clearPlateInMemory();
}
plateInMemory = plate;
plateFilename = filename;
System.out.println("Sample plate file " + filename + " cached.");
}
public static void clearPlateInMemory() {
plateInMemory = null;
plateFilename = null;
System.gc();
System.out.println("Sample plate file cache cleared.");
}
public static String getPlateFilename() {
return plateFilename;
}
public static GraphWithMapData getGraphInMemory() {
return graphInMemory;
}
public static void setGraphInMemory(GraphWithMapData g, String filename) {
if (graphInMemory != null) {
clearGraphInMemory();
}
graphInMemory = g;
graphFilename = filename;
System.out.println("Graph and data file " + filename + " cached.");
}
public static void clearGraphInMemory() {
graphInMemory = null;
graphFilename = null;
System.gc();
System.out.println("Graph and data file cache cleared.");
}
public static String getGraphFilename() {
return graphFilename;
}
public static boolean cacheCells() {
return cacheCells;
}
public static void setCacheCells(boolean cacheCells) {
//if not caching, clear the memory
if(!cacheCells){
BiGpairSEQ.clearCellSampleInMemory();
System.out.println("Cell sample file caching: OFF.");
}
else {
System.out.println("Cell sample file caching: ON.");
}
BiGpairSEQ.cacheCells = cacheCells;
}
public static boolean cachePlate() {
return cachePlate;
}
public static void setCachePlate(boolean cachePlate) {
//if not caching, clear the memory
if(!cachePlate) {
BiGpairSEQ.clearPlateInMemory();
System.out.println("Sample plate file caching: OFF.");
}
else {
System.out.println("Sample plate file caching: ON.");
}
BiGpairSEQ.cachePlate = cachePlate;
}
public static boolean cacheGraph() {
return cacheGraph;
}
public static void setCacheGraph(boolean cacheGraph) {
//if not caching, clear the memory
if(!cacheGraph) {
BiGpairSEQ.clearGraphInMemory();
System.out.println("Graph/data file caching: OFF.");
}
else {
System.out.println("Graph/data file caching: ON.");
}
BiGpairSEQ.cacheGraph = cacheGraph;
}
public static String getPriorityQueueHeapType() {
return priorityQueueHeapType;
}
public static void setPairingHeap() {
priorityQueueHeapType = "PAIRING";
}
public static void setFibonacciHeap() {
priorityQueueHeapType = "FIBONACCI";
}
}

View File: CellFileReader.java

@@ -13,6 +13,7 @@ public class CellFileReader {
private String filename;
private List<Integer[]> distinctCells = new ArrayList<>();
private Integer cdr1Freq;
public CellFileReader(String filename) {
if(!filename.matches(".*\\.csv")){
@@ -38,19 +39,37 @@ public class CellFileReader {
cell[3] = Integer.valueOf(record.get("Beta CDR1"));
distinctCells.add(cell);
}
} catch(IOException ex){
System.out.println("cell file " + filename + " not found.");
System.err.println(ex);
}
//get CDR1 frequency
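//cdr1Freq = ceil(#distinct cells / #distinct CDR1 values seen), i.e. roughly how many distinct
//CDR3 pairs share each CDR1. (cell[3] is the "Beta CDR1" column read above.)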
ArrayList<Integer> cdr1Alphas = new ArrayList<>();
for (Integer[] cell : distinctCells) {
cdr1Alphas.add(cell[3]);
}
double count = cdr1Alphas.stream().distinct().count();
count = Math.ceil(distinctCells.size() / count);
cdr1Freq = (int) count;
}
public CellSample getCellSample() {
return new CellSample(distinctCells, cdr1Freq);
}
public String getFilename() { return filename;}
//Refactor everything that uses this to have access to a Cell Sample and get the cells there instead.
public List<Integer[]> getListOfDistinctCellsDEPRECATED(){
return distinctCells;
}
public Integer getCellCountDEPRECATED() {
//Refactor everything that uses this to have access to a Cell Sample and get the count there instead.
return distinctCells.size();
}
}

View File: CellSample.java

@@ -18,7 +18,7 @@ public class CellSample {
return cdr1Freq;
}
public Integer getCellCount(){
return cells.size();
}

View File: CommandLineInterface.java

@@ -0,0 +1,328 @@
import org.apache.commons.cli.*;
/*
* Class for parsing options passed to program from command line
*
* Top-level flags:
* cells : to make a cell sample file
* plate : to make a sample plate file
* graph : to make a graph and data file
* match : to do a cdr3 matching (WITH OR WITHOUT MAKING A RESULTS FILE. May just want to print summary for piping.)
*
* Cell flags:
* count : number of cells to generate
* diversity factor : factor by which CDR3s are more diverse than CDR1s
* output : name of the output file
*
* Plate flags:
* cellfile : name of the cell sample file to use as input
* wells : the number of wells on the plate
* dist : the statistical distribution to use
* (if exponential) lambda : the lambda value of the exponential distribution
* (if gaussian) stddev : the standard deviation of the gaussian distribution
* rand : randomize well populations, take a minimum argument and a maximum argument
* populations : number of t cells per well per section (number of arguments determines number of sections)
* dropout : plate dropout rate, double from 0.0 to 1.0
* output : name of the output file
*
* Graph flags:
* cellfile : name of the cell sample file to use as input
* platefile : name of the sample plate file to use as input
* output : name of the output file
*
* Match flags:
* graphFile : name of graph and data file to use as input
* min : minimum number of overlap wells to attempt a matching
* max : the maximum number of overlap wells to attempt a matching
* maxdiff : (optional) the maximum difference in occupancy to attempt a matching
* minpercent : (optional) the minimum percent overlap to attempt a matching.
* writefile : (optional) the filename to write results to
* output : the values to print to System.out for piping
*
*/
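//Illustrative invocations (hypothetical jar name and filenames; the CLI is still being re-implemented,
//so these flags may change):
//  java -jar BiGpairSEQ_Sim.jar -cells -nc 100000 -d 10 -o cells.csv
//  java -jar BiGpairSEQ_Sim.jar -plates -c cells.csv -w 96 -np 1 -exponential -lambda 0.6 -t 500 -err 0.1 -o plate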
public class CommandLineInterface {
public static void startCLI(String[] args) {
//These command line options are a big mess
//Really, I don't think command line tools are expected to work in this many different modes
//making cells, making plates, and matching are the sort of thing that UNIX philosophy would say
//should be three separate programs.
//There might be a way to do it with option parameters?
//main options set
Options mainOptions = new Options();
Option makeCells = Option.builder("cells")
.longOpt("make-cells")
.desc("Makes a file of distinct cells")
.build();
Option makePlate = Option.builder("plates")
.longOpt("make-plates")
.desc("Makes a sample plate file")
.build();
Option makeGraph = Option.builder("graph")
.longOpt("make-graph")
.desc("Makes a graph and data file")
.build();
Option matchCDR3 = Option.builder("match")
.longOpt("match-cdr3")
.desc("Match CDR3s. Requires a cell sample file and any number of plate files.")
.build();
OptionGroup mainGroup = new OptionGroup();
mainGroup.addOption(makeCells);
mainGroup.addOption(makePlate);
mainGroup.addOption(makeGraph);
mainGroup.addOption(matchCDR3);
mainGroup.setRequired(true);
mainOptions.addOptionGroup(mainGroup);
//Reuse clones of this for other options groups, rather than making it lots of times
Option outputFile = Option.builder("o")
.longOpt("output-file")
.hasArg()
.argName("filename")
.desc("Name of output file")
.build();
mainOptions.addOption(outputFile);
//Options cellOptions = new Options();
Option numCells = Option.builder("nc")
.longOpt("num-cells")
.desc("The number of distinct cells to generate")
.hasArg()
.argName("number")
.build();
mainOptions.addOption(numCells);
Option cdr1Freq = Option.builder("d")
.longOpt("peptide-diversity-factor")
.hasArg()
.argName("number")
.desc("Number of distinct CDR3s for every CDR1")
.build();
mainOptions.addOption(cdr1Freq);
//Option cellOutput = (Option) outputFile.clone();
//cellOutput.setRequired(true);
//mainOptions.addOption(cellOutput);
//Options plateOptions = new Options();
Option inputCells = Option.builder("c")
.longOpt("cell-file")
.hasArg()
.argName("file")
.desc("The cell sample file used for filling wells")
.build();
mainOptions.addOption(inputCells);
Option numWells = Option.builder("w")
.longOpt("num-wells")
.hasArg()
.argName("number")
.desc("The number of wells on each plate")
.build();
mainOptions.addOption(numWells);
Option numPlates = Option.builder("np")
.longOpt("num-plates")
.hasArg()
.argName("number")
.desc("The number of plate files to output")
.build();
mainOptions.addOption(numPlates);
//Option plateOutput = (Option) outputFile.clone();
//plateOutput.setRequired(true);
//plateOutput.setDescription("Prefix for plate output filenames");
//mainOptions.addOption(plateOutput);
Option plateErr = Option.builder("err")
.longOpt("drop-out-rate")
.hasArg()
.argName("number")
.desc("Well drop-out rate. (Probability between 0 and 1)")
.build();
mainOptions.addOption(plateErr);
Option plateConcentrations = Option.builder("t")
.longOpt("t-cells-per-well")
.hasArgs()
.argName("number 1, number 2, ...")
.desc("Number of T cells per well for each plate section")
.build();
mainOptions.addOption(plateConcentrations);
//different distributions, mutually exclusive
OptionGroup plateDistributions = new OptionGroup();
Option plateExp = Option.builder("exponential")
.desc("Sample from distinct cells with exponential frequency distribution")
.build();
plateDistributions.addOption(plateExp);
Option plateGaussian = Option.builder("gaussian")
.desc("Sample from distinct cells with gaussain frequency distribution")
.build();
plateDistributions.addOption(plateGaussian);
Option platePoisson = Option.builder("poisson")
.desc("Sample from distinct cells with poisson frequency distribution")
.build();
plateDistributions.addOption(platePoisson);
mainOptions.addOptionGroup(plateDistributions);
Option plateStdDev = Option.builder("stddev")
.desc("Standard deviation for gaussian distribution")
.hasArg()
.argName("number")
.build();
mainOptions.addOption(plateStdDev);
Option plateLambda = Option.builder("lambda")
.desc("Lambda for exponential distribution")
.hasArg()
.argName("number")
.build();
mainOptions.addOption(plateLambda);
//
// String cellFile, String filename, Double stdDev,
// Integer numWells, Integer numSections,
// Integer[] concentrations, Double dropOutRate
//
//Options matchOptions = new Options();
inputCells.setDescription("The cell sample file to be used for matching.");
mainOptions.addOption(inputCells);
Option lowThresh = Option.builder("low")
.longOpt("low-threshold")
.hasArg()
.argName("number")
.desc("Sets the minimum occupancy overlap to attempt matching")
.build();
mainOptions.addOption(lowThresh);
Option highThresh = Option.builder("high")
.longOpt("high-threshold")
.hasArg()
.argName("number")
.desc("Sets the maximum occupancy overlap to attempt matching")
.build();
mainOptions.addOption(highThresh);
Option occDiff = Option.builder("occdiff")
.longOpt("occupancy-difference")
.hasArg()
.argName("Number")
.desc("Maximum difference in alpha/beta occupancy to attempt matching")
.build();
mainOptions.addOption(occDiff);
Option overlapPer = Option.builder("ovper")
.longOpt("overlap-percent")
.hasArg()
.argName("Percent")
.desc("Minimum overlap percent to attempt matching (0 -100)")
.build();
mainOptions.addOption(overlapPer);
Option inputPlates = Option.builder("p")
.longOpt("plate-files")
.hasArgs()
.desc("Plate files to match")
.build();
mainOptions.addOption(inputPlates);
CommandLineParser parser = new DefaultParser();
try {
CommandLine line = parser.parse(mainOptions, args);
if(line.hasOption("match")){
//line = parser.parse(mainOptions, args);
//String cellFile = line.getOptionValue("c");
String graphFile = line.getOptionValue("g");
Integer lowThreshold = Integer.valueOf(line.getOptionValue(lowThresh));
Integer highThreshold = Integer.valueOf(line.getOptionValue(highThresh));
Integer occupancyDifference = Integer.valueOf(line.getOptionValue(occDiff));
Integer overlapPercent = Integer.valueOf(line.getOptionValue(overlapPer));
for(String plate: line.getOptionValues("p")) {
matchCDR3s(graphFile, lowThreshold, highThreshold, occupancyDifference, overlapPercent);
}
}
else if(line.hasOption("cells")){
//line = parser.parse(mainOptions, args);
String filename = line.getOptionValue("o");
Integer numDistCells = Integer.valueOf(line.getOptionValue("nc"));
Integer freq = Integer.valueOf(line.getOptionValue("d"));
makeCells(filename, numDistCells, freq);
}
else if(line.hasOption("plates")){
//line = parser.parse(mainOptions, args);
String cellFile = line.getOptionValue("c");
String filenamePrefix = line.getOptionValue("o");
Integer numWellsOnPlate = Integer.valueOf(line.getOptionValue("w"));
Integer numPlatesToMake = Integer.valueOf(line.getOptionValue("np"));
String[] concentrationsToUseString = line.getOptionValues("t");
Integer numSections = concentrationsToUseString.length;
Integer[] concentrationsToUse = new Integer[numSections];
for(int i = 0; i <numSections; i++){
concentrationsToUse[i] = Integer.valueOf(concentrationsToUseString[i]);
}
Double dropOutRate = Double.valueOf(line.getOptionValue("err"));
if(line.hasOption("exponential")){
Double lambda = Double.valueOf(line.getOptionValue("lambda"));
for(int i = 1; i <= numPlatesToMake; i++){
makePlateExp(cellFile, filenamePrefix + i, lambda, numWellsOnPlate,
concentrationsToUse,dropOutRate);
}
}
else if(line.hasOption("gaussian")){
Double stdDev = Double.valueOf(line.getOptionValue("stddev"));
for(int i = 1; i <= numPlatesToMake; i++){
makePlate(cellFile, filenamePrefix + i, stdDev, numWellsOnPlate,
concentrationsToUse,dropOutRate);
}
}
else if(line.hasOption("poisson")){
for(int i = 1; i <= numPlatesToMake; i++){
makePlatePoisson(cellFile, filenamePrefix + i, numWellsOnPlate,
concentrationsToUse,dropOutRate);
}
}
}
}
catch (ParseException exp) {
System.err.println("Parsing failed. Reason: " + exp.getMessage());
}
}
//for calling from command line
public static void makeCells(String filename, Integer numCells, Integer cdr1Freq){
CellSample sample = Simulator.generateCellSample(numCells, cdr1Freq);
CellFileWriter writer = new CellFileWriter(filename, sample);
writer.writeCellsToFile();
}
public static void makePlateExp(String cellFile, String filename, Double lambda,
Integer numWells, Integer[] concentrations, Double dropOutRate){
CellFileReader cellReader = new CellFileReader(cellFile);
Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
samplePlate.fillWellsExponential(cellReader.getFilename(), cellReader.getListOfDistinctCellsDEPRECATED(), lambda);
PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
writer.writePlateFile();
}
private static void makePlatePoisson(String cellFile, String filename, Integer numWells,
Integer[] concentrations, Double dropOutRate){
CellFileReader cellReader = new CellFileReader(cellFile);
Double stdDev = Math.sqrt(cellReader.getCellCountDEPRECATED());
Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
samplePlate.fillWells(cellReader.getFilename(), cellReader.getListOfDistinctCellsDEPRECATED(), stdDev);
PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
writer.writePlateFile();
}
private static void makePlate(String cellFile, String filename, Double stdDev,
Integer numWells, Integer[] concentrations, Double dropOutRate){
CellFileReader cellReader = new CellFileReader(cellFile);
Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
samplePlate.fillWells(cellReader.getFilename(), cellReader.getListOfDistinctCellsDEPRECATED(), stdDev);
PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
writer.writePlateFile();
}
private static void matchCDR3s(String graphFile, Integer lowThreshold, Integer highThreshold,
Integer occupancyDifference, Integer overlapPercent) {
}
}

View File: Equations.java

@@ -4,10 +4,6 @@ import java.math.MathContext;
public abstract class Equations {
public static int getRandomNumber(int min, int max) {
return (int) ((Math.random() * (max - min)) + min);
}
//pValue calculation as described in original pairSEQ paper.
//Included for comparison with original results.
//Not used by BiGpairSEQ for matching.

View File: GraphDataObjectReader.java

@@ -13,6 +13,8 @@ public class GraphDataObjectReader {
BufferedInputStream fileIn = new BufferedInputStream(new FileInputStream(filename));
ObjectInputStream in = new ObjectInputStream(fileIn))
{
System.out.println("Reading graph data from file. This may take some time");
System.out.println("File I/O time is not included in results");
data = (GraphWithMapData) in.readObject();
} catch (FileNotFoundException | ClassNotFoundException ex) {
ex.printStackTrace();

View File: GraphDataObjectWriter.java

@@ -18,8 +18,11 @@ public class GraphDataObjectWriter {
public void writeDataToFile() {
try (BufferedOutputStream bufferedOut = new BufferedOutputStream(new FileOutputStream(filename));
ObjectOutputStream out = new ObjectOutputStream(bufferedOut);
){
System.out.println("Writing graph and occupancy data to file. This may take some time.");
System.out.println("File I/O time is not included in results.");
out.writeObject(data);
} catch (IOException ex) {
ex.printStackTrace();

View File: GraphModificationFunctions.java

@@ -0,0 +1,112 @@
import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
public interface GraphModificationFunctions {
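//When saveEdges is true (the graph is cached in memory and must stay reusable), filtered edges are
//removed from the graph and returned so they can be restored later with addRemovedEdges; when false,
//the offending edge weights are simply set to zero.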
//remove over- and under-weight edges
static List<Integer[]> filterByOverlapThresholds(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
int low, int high, boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
if ((graph.getEdgeWeight(e) > high) || (graph.getEdgeWeight(e) < low)) {
if(saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer weight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, weight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
//Remove edges for pairs with large occupancy discrepancy
static List<Integer[]> filterByRelativeOccupancy(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer maxOccupancyDifference, boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
Integer betaOcc = betaWellCounts.get(plateVtoBMap.get(graph.getEdgeTarget(e)));
if (Math.abs(alphaOcc - betaOcc) >= maxOccupancyDifference) {
if (saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer weight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, weight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
//Remove edges for pairs where overlap size is significantly lower than the well occupancy
static List<Integer[]> filterByOverlapPercent(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer minOverlapPercent,
boolean saveEdges) {
List<Integer[]> removedEdges = new ArrayList<>();
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
Integer betaOcc = betaWellCounts.get(plateVtoBMap.get(graph.getEdgeTarget(e)));
double weight = graph.getEdgeWeight(e);
double min = minOverlapPercent / 100.0;
if ((weight / alphaOcc < min) || (weight / betaOcc < min)) {
if(saveEdges) {
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
Integer intWeight = (int) graph.getEdgeWeight(e);
Integer[] edge = {source, target, intWeight};
removedEdges.add(edge);
}
else {
graph.setEdgeWeight(e, 0.0);
}
}
}
if(saveEdges) {
for (Integer[] edge : removedEdges) {
graph.removeEdge(edge[0], edge[1]);
}
}
return removedEdges;
}
static void addRemovedEdges(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
List<Integer[]> removedEdges) {
for (Integer[] edge : removedEdges) {
DefaultWeightedEdge e = graph.addEdge(edge[0], edge[1]);
graph.setEdgeWeight(e, (double) edge[2]);
}
}
}

View File: GraphWithMapData.java

@@ -11,7 +11,7 @@ public class GraphWithMapData implements java.io.Serializable {
private String sourceFilename;
private final SimpleWeightedGraph graph;
private Integer numWells;
private Integer[] wellPopulations;
private Integer alphaCount;
private Integer betaCount;
private final Map<Integer, Integer> distCellsMapAlphaKey;
@@ -31,7 +31,7 @@ public class GraphWithMapData implements java.io.Serializable {
Map<Integer, Integer> betaWellCounts, Duration time) {
this.graph = graph;
this.numWells = numWells;
this.wellPopulations = wellConcentrations;
this.alphaCount = alphaCount;
this.betaCount = betaCount;
this.distCellsMapAlphaKey = distCellsMapAlphaKey;
@@ -52,8 +52,8 @@ public class GraphWithMapData implements java.io.Serializable {
return numWells;
}
public Integer[] getWellPopulations() {
return wellPopulations;
}
public Integer getAlphaCount() {

View File: InteractiveInterface.java

@@ -0,0 +1,579 @@
import java.io.IOException;
import java.util.*;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
//
public class InteractiveInterface {
private static final Random rand = BiGpairSEQ.getRand();
private static final Scanner sc = new Scanner(System.in);
private static int input;
private static boolean quit = false;
public static void startInteractive() {
while (!quit) {
System.out.println();
System.out.println("--------BiGPairSEQ SIMULATOR--------");
System.out.println("ALPHA/BETA T CELL RECEPTOR MATCHING");
System.out.println(" USING WEIGHTED BIPARTITE GRAPHS ");
System.out.println("------------------------------------");
System.out.println("Please select an option:");
System.out.println("1) Generate a population of distinct cells");
System.out.println("2) Generate a sample plate of T cells");
System.out.println("3) Generate CDR3 alpha/beta occupancy data and overlap graph");
System.out.println("4) Simulate bipartite graph CDR3 alpha/beta matching (BiGpairSEQ)");
//Need to re-do the CDR3/CDR1 matching to correspond to new pattern
//System.out.println("5) Generate CDR3/CDR1 occupancy graph");
//System.out.println("6) Simulate CDR3/CDR1 T cell matching");
System.out.println("8) Options");
System.out.println("9) About/Acknowledgments");
System.out.println("0) Exit");
try {
input = sc.nextInt();
switch (input) {
case 1 -> makeCells();
case 2 -> makePlate();
case 3 -> makeCDR3Graph();
case 4 -> matchCDR3s();
//case 6 -> matchCellsCDR1();
case 8 -> mainOptions();
case 9 -> acknowledge();
case 0 -> quit = true;
default -> System.out.println("Invalid input.");
}
} catch (InputMismatchException | IOException ex) {
System.out.println(ex);
sc.next();
}
}
sc.close();
}
private static void makeCells() {
String filename = null;
Integer numCells = 0;
Integer cdr1Freq = 1;
try {
System.out.println("\nSimulated T-Cells consist of integer values representing:\n" +
"* a pair of alpha and beta CDR3 peptides (unique within simulated population)\n" +
"* a pair of alpha and beta CDR1 peptides (not necessarily unique).");
System.out.println("\nThe cells will be written to a CSV file.");
System.out.print("Please enter a file name: ");
filename = sc.next();
System.out.println("\nCDR3 sequences are more diverse than CDR1 sequences.");
System.out.println("Please enter the factor by which distinct CDR3s outnumber CDR1s: ");
cdr1Freq = sc.nextInt();
System.out.print("\nPlease enter the number of T-cells to generate: ");
numCells = sc.nextInt();
if(numCells <= 0){
throw new InputMismatchException("Number of cells must be a positive integer.");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
CellSample sample = Simulator.generateCellSample(numCells, cdr1Freq);
assert filename != null;
System.out.println("Writing cells to file");
CellFileWriter writer = new CellFileWriter(filename, sample);
writer.writeCellsToFile();
System.out.println("Cell sample written to: " + filename);
if(BiGpairSEQ.cacheCells()) {
BiGpairSEQ.setCellSampleInMemory(sample, filename);
}
}
//Output a CSV of sample plate
private static void makePlate() {
String cellFile = null;
String filename = null;
Double stdDev = 0.0;
Integer numWells = 0;
Integer numSections;
Integer[] populations = {1};
Double dropOutRate = 0.0;
boolean poisson = false;
boolean exponential = false;
double lambda = 1.5;
try {
System.out.println("\nSimulated sample plates consist of:");
System.out.println("* a number of wells");
System.out.println(" * separated into one or more sections");
System.out.println(" * each of which has a set quantity of cells per well");
System.out.println(" * selected from a statistical distribution of distinct cells");
System.out.println(" * with a set dropout rate for individual sequences within a cell");
System.out.println("\nMaking a sample plate requires a population of distinct cells");
System.out.print("Please enter name of an existing cell sample file: ");
cellFile = sc.next();
System.out.println("\nThe sample plate will be written to a CSV file");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
System.out.println("\nSelect T-cell frequency distribution function");
System.out.println("1) Poisson");
System.out.println("2) Gaussian");
System.out.println("3) Exponential");
System.out.println("(Note: approximate distribution in original paper is exponential, lambda = 0.6)");
System.out.println("(lambda value approximated from slope of log-log graph in figure 4c)");
System.out.println("(Note: wider distributions are more memory intensive to match)");
System.out.print("Enter selection value: ");
input = sc.nextInt();
switch (input) {
case 1 -> poisson = true;
case 2 -> {
System.out.println("How many distinct T-cells within one standard deviation of peak frequency?");
System.out.println("(Note: wider distributions are more memory intensive to match)");
stdDev = sc.nextDouble();
if (stdDev <= 0.0) {
throw new InputMismatchException("Value must be positive.");
}
}
case 3 -> {
exponential = true;
System.out.print("Please enter lambda value for exponential distribution: ");
lambda = sc.nextDouble();
if (lambda <= 0.0) {
lambda = 0.6;
System.out.println("Value must be positive. Defaulting to 0.6.");
}
}
default -> {
System.out.println("Invalid input. Defaulting to exponential.");
exponential = true;
}
}
System.out.print("\nNumber of wells on plate: ");
numWells = sc.nextInt();
if(numWells < 1){
throw new InputMismatchException("No wells on plate");
}
//choose whether to make T cell population/well random
boolean randomWellPopulations;
System.out.println("Randomize number of T cells in each well? (y/n)");
String ans = sc.next();
Pattern pattern = Pattern.compile("(?:yes|y)", Pattern.CASE_INSENSITIVE);
Matcher matcher = pattern.matcher(ans);
if(matcher.matches()){
randomWellPopulations = true;
}
else{
randomWellPopulations = false;
}
if(randomWellPopulations) { //if T cell population/well is random
numSections = numWells;
Integer minPop;
Integer maxPop;
System.out.print("Please enter minimum number of T cells in a well: ");
minPop = sc.nextInt();
if(minPop < 1) {
throw new InputMismatchException("Minimum well population must be positive");
}
System.out.println("Please enter maximum number of T cells in a well: ");
maxPop = sc.nextInt();
if(maxPop < minPop) {
throw new InputMismatchException("Max well population must be greater than min well population");
}
//maximum should be inclusive, so need to add one to max of randomly generated values
populations = rand.ints(minPop, maxPop + 1)
.limit(numSections)
.boxed()
.toArray(Integer[]::new);
System.out.print("Populations: ");
System.out.println(Arrays.toString(populations));
}
else{ //if T cell population/well is not random
System.out.println("\nThe plate can be evenly sectioned to allow different numbers of T cells per well.");
System.out.println("How many sections would you like to make (minimum 1)?");
numSections = sc.nextInt();
if (numSections < 1) {
throw new InputMismatchException("Too few sections.");
} else if (numSections > numWells) {
throw new InputMismatchException("Cannot have more sections than wells.");
}
int i = 1;
populations = new Integer[numSections];
while (numSections > 0) {
System.out.print("Enter number of T cells per well in section " + i + ": ");
populations[i - 1] = sc.nextInt();
i++;
numSections--;
}
}
System.out.println("\nErrors in amplification can induce a well dropout rate for sequences");
System.out.print("Enter well dropout rate (0.0 to 1.0): ");
dropOutRate = sc.nextDouble();
if(dropOutRate < 0.0 || dropOutRate > 1.0) {
throw new InputMismatchException("The well dropout rate must be in the range [0.0, 1.0]");
}
}catch(InputMismatchException ex){
System.out.println(ex);
sc.next();
}
assert cellFile != null;
CellSample cells;
if (cellFile.equals(BiGpairSEQ.getCellFilename())){
cells = BiGpairSEQ.getCellSampleInMemory();
}
else {
System.out.println("Reading Cell Sample file: " + cellFile);
CellFileReader cellReader = new CellFileReader(cellFile);
cells = cellReader.getCellSample();
if(BiGpairSEQ.cacheCells()) {
BiGpairSEQ.setCellSampleInMemory(cells, cellFile);
}
}
assert filename != null;
Plate samplePlate;
PlateFileWriter writer;
if(exponential){
samplePlate = new Plate(numWells, dropOutRate, populations);
samplePlate.fillWellsExponential(cellFile, cells.getCells(), lambda);
writer = new PlateFileWriter(filename, samplePlate);
}
else {
if (poisson) {
stdDev = Math.sqrt(cells.getCellCount()); //gaussian with square root of elements approximates poisson
}
samplePlate = new Plate(numWells, dropOutRate, populations);
samplePlate.fillWells(cellFile, cells.getCells(), stdDev);
writer = new PlateFileWriter(filename, samplePlate);
}
System.out.println("Writing Sample Plate to file");
writer.writePlateFile();
System.out.println("Sample Plate written to file: " + filename);
if(BiGpairSEQ.cachePlate()) {
BiGpairSEQ.setPlateInMemory(samplePlate, filename);
}
}
//Output serialized binary of GraphAndMapData object
private static void makeCDR3Graph() {
String filename = null;
String cellFile = null;
String plateFile = null;
try {
String str = "\nGenerating bipartite weighted graph encoding occupancy overlap data ";
str = str.concat("\nrequires a cell sample file and a sample plate file.");
System.out.println(str);
System.out.print("\nPlease enter name of an existing cell sample file: ");
cellFile = sc.next();
System.out.print("\nPlease enter name of an existing sample plate file: ");
plateFile = sc.next();
System.out.println("\nThe graph and occupancy data will be written to a serialized binary file.");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
assert cellFile != null;
CellSample cellSample;
//check if cells are already in memory
if(cellFile.equals(BiGpairSEQ.getCellFilename()) && BiGpairSEQ.getCellSampleInMemory() != null) {
cellSample = BiGpairSEQ.getCellSampleInMemory();
}
else {
System.out.println("Reading Cell Sample file: " + cellFile);
CellFileReader cellReader = new CellFileReader(cellFile);
cellSample = cellReader.getCellSample();
if(BiGpairSEQ.cacheCells()) {
BiGpairSEQ.setCellSampleInMemory(cellSample, cellFile);
}
}
assert plateFile != null;
Plate plate;
//check if plate is already in memory
if(plateFile.equals(BiGpairSEQ.getPlateFilename())){
plate = BiGpairSEQ.getPlateInMemory();
}
else {
System.out.println("Reading Sample Plate file: " + plateFile);
PlateFileReader plateReader = new PlateFileReader(plateFile);
plate = new Plate(plateReader.getFilename(), plateReader.getWells());
if(BiGpairSEQ.cachePlate()) {
BiGpairSEQ.setPlateInMemory(plate, plateFile);
}
}
if (cellSample.getCells().size() == 0){
System.out.println("No cell sample found.");
System.out.println("Returning to main menu.");
}
else if(plate.getWells().size() == 0 || plate.getPopulations().length == 0){
System.out.println("No sample plate found.");
System.out.println("Returning to main menu.");
}
else{
List<Integer[]> cells = cellSample.getCells();
GraphWithMapData data = Simulator.makeGraph(cells, plate, true);
assert filename != null;
GraphDataObjectWriter dataWriter = new GraphDataObjectWriter(filename, data);
dataWriter.writeDataToFile();
System.out.println("Graph and Data file written to: " + filename);
if(BiGpairSEQ.cacheGraph()) {
BiGpairSEQ.setGraphInMemory(data, filename);
}
}
}
//Simulate matching and output CSV file of results
private static void matchCDR3s() throws IOException {
String filename = null;
String graphFilename = null;
Integer lowThreshold = 0;
Integer highThreshold = Integer.MAX_VALUE;
Integer maxOccupancyDiff = Integer.MAX_VALUE;
Integer minOverlapPercent = 0;
try {
System.out.println("\nBiGpairSEQ simulation requires an occupancy data and overlap graph file");
System.out.println("Please enter name of an existing graph and occupancy data file: ");
graphFilename = sc.next();
System.out.println("The matching results will be written to a file.");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
System.out.println("\nWhat is the minimum number of CDR3 alpha/beta overlap wells to attempt matching?");
lowThreshold = sc.nextInt();
if(lowThreshold < 1){
lowThreshold = 1;
System.out.println("Value for low occupancy overlap threshold must be positive");
System.out.println("Value for low occupancy overlap threshold set to 1");
}
System.out.println("\nWhat is the maximum number of CDR3 alpha/beta overlap wells to attempt matching?");
highThreshold = sc.nextInt();
if(highThreshold < lowThreshold) {
highThreshold = lowThreshold;
System.out.println("Value for high occupancy overlap threshold must be >= low overlap threshold");
System.out.println("Value for high occupancy overlap threshold set to " + lowThreshold);
}
System.out.println("What is the minimum percentage of a sequence's wells in alpha/beta overlap to attempt matching? (0 - 100)");
minOverlapPercent = sc.nextInt();
if (minOverlapPercent < 0 || minOverlapPercent > 100) {
minOverlapPercent = 0;
System.out.println("Value outside range. Minimum occupancy overlap percentage set to 0");
}
System.out.println("\nWhat is the maximum difference in alpha/beta occupancy to attempt matching?");
maxOccupancyDiff = sc.nextInt();
if (maxOccupancyDiff < 0) {
maxOccupancyDiff = 0;
System.out.println("Maximum allowable difference in alpha/beta occupancy must be nonnegative");
System.out.println("Maximum allowable difference in alpha/beta occupancy set to 0");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
assert graphFilename != null;
//check if this is the same graph we already have in memory.
GraphWithMapData data;
if(graphFilename.equals(BiGpairSEQ.getGraphFilename())) {
data = BiGpairSEQ.getGraphInMemory();
}
else {
GraphDataObjectReader dataReader = new GraphDataObjectReader(graphFilename);
data = dataReader.getData();
if(BiGpairSEQ.cacheGraph()) {
BiGpairSEQ.setGraphInMemory(data, graphFilename);
}
}
//simulate matching
MatchingResult results = Simulator.matchCDR3s(data, graphFilename, lowThreshold, highThreshold, maxOccupancyDiff,
minOverlapPercent, true);
//write results to file
assert filename != null;
MatchingFileWriter writer = new MatchingFileWriter(filename, results);
System.out.println("Writing results to file");
writer.writeResultsToFile();
System.out.println("Results written to file: " + filename);
}
///////
//Rewrite this to fit new matchCDR3 method with file I/O
///////
// public static void matchCellsCDR1(){
// /*
// The idea here is that we'll get the CDR3 alpha/beta matches first. Then we'll try to match CDR3s to CDR1s by
// looking at the top two matches for each CDR3. If CDR3s in the same cell simply swap CDR1s, we assume a correct
// match
// */
// String filename = null;
// String preliminaryResultsFilename = null;
// String cellFile = null;
// String plateFile = null;
// Integer lowThresholdCDR3 = 0;
// Integer highThresholdCDR3 = Integer.MAX_VALUE;
// Integer maxOccupancyDiffCDR3 = 96; //no filtering if max difference is all wells by default
// Integer minOverlapPercentCDR3 = 0; //no filtering if min percentage is zero by default
// Integer lowThresholdCDR1 = 0;
// Integer highThresholdCDR1 = Integer.MAX_VALUE;
// boolean outputCDR3Matches = false;
// try {
// System.out.println("\nSimulated experiment requires a cell sample file and a sample plate file.");
// System.out.print("Please enter name of an existing cell sample file: ");
// cellFile = sc.next();
// System.out.print("Please enter name of an existing sample plate file: ");
// plateFile = sc.next();
// System.out.println("The matching results will be written to a file.");
// System.out.print("Please enter a name for the output file: ");
// filename = sc.next();
// System.out.println("What is the minimum number of CDR3 alpha/beta overlap wells to attempt matching?");
// lowThresholdCDR3 = sc.nextInt();
// if(lowThresholdCDR3 < 1){
// throw new InputMismatchException("Minimum value for low threshold is 1");
// }
// System.out.println("What is the maximum number of CDR3 alpha/beta overlap wells to attempt matching?");
// highThresholdCDR3 = sc.nextInt();
// System.out.println("What is the maximum difference in CDR3 alpha/beta occupancy to attempt matching?");
// maxOccupancyDiffCDR3 = sc.nextInt();
// System.out.println("What is the minimum CDR3 overlap percentage to attempt matching? (0 - 100)");
// minOverlapPercentCDR3 = sc.nextInt();
// if (minOverlapPercentCDR3 < 0 || minOverlapPercentCDR3 > 100) {
// throw new InputMismatchException("Value outside range. Minimum percent set to 0");
// }
// System.out.println("What is the minimum number of CDR3/CDR1 overlap wells to attempt matching?");
// lowThresholdCDR1 = sc.nextInt();
// if(lowThresholdCDR1 < 1){
// throw new InputMismatchException("Minimum value for low threshold is 1");
// }
// System.out.println("What is the maximum number of CDR3/CDR1 overlap wells to attempt matching?");
// highThresholdCDR1 = sc.nextInt();
// System.out.println("Matching CDR3s to CDR1s requires first matching CDR3 alpha/betas.");
// System.out.println("Output a file for CDR3 alpha/beta match results as well?");
// System.out.print("Please enter y/n: ");
// String ans = sc.next();
// Pattern pattern = Pattern.compile("(?:yes|y)", Pattern.CASE_INSENSITIVE);
// Matcher matcher = pattern.matcher(ans);
// if(matcher.matches()){
// outputCDR3Matches = true;
// System.out.println("Please enter filename for CDR3 alpha/beta match results");
// preliminaryResultsFilename = sc.next();
// System.out.println("CDR3 alpha/beta matches will be output to file");
// }
// else{
// System.out.println("CDR3 alpha/beta matches will not be output to file");
// }
// } catch (InputMismatchException ex) {
// System.out.println(ex);
// sc.next();
// }
// CellFileReader cellReader = new CellFileReader(cellFile);
// PlateFileReader plateReader = new PlateFileReader(plateFile);
// Plate plate = new Plate(plateReader.getFilename(), plateReader.getWells());
// if (cellReader.getCells().size() == 0){
// System.out.println("No cell sample found.");
// System.out.println("Returning to main menu.");
// }
// else if(plate.getWells().size() == 0){
// System.out.println("No sample plate found.");
// System.out.println("Returning to main menu.");
//
// }
// else{
// if(highThresholdCDR3 >= plate.getSize()){
// highThresholdCDR3 = plate.getSize() - 1;
// }
// if(highThresholdCDR1 >= plate.getSize()){
// highThresholdCDR1 = plate.getSize() - 1;
// }
// List<Integer[]> cells = cellReader.getCells();
// MatchingResult preliminaryResults = Simulator.matchCDR3s(cells, plate, lowThresholdCDR3, highThresholdCDR3,
// maxOccupancyDiffCDR3, minOverlapPercentCDR3, true);
// MatchingResult[] results = Simulator.matchCDR1s(cells, plate, lowThresholdCDR1,
// highThresholdCDR1, preliminaryResults);
// MatchingFileWriter writer = new MatchingFileWriter(filename + "_FirstPass", results[0]);
// writer.writeResultsToFile();
// writer = new MatchingFileWriter(filename + "_SecondPass", results[1]);
// writer.writeResultsToFile();
// if(outputCDR3Matches){
// writer = new MatchingFileWriter(preliminaryResultsFilename, preliminaryResults);
// writer.writeResultsToFile();
// }
// }
// }
private static void mainOptions(){
boolean backToMain = false;
while(!backToMain) {
System.out.println("\n--------------OPTIONS---------------");
System.out.println("1) Turn " + getOnOff(!BiGpairSEQ.cacheCells()) + " cell sample file caching");
System.out.println("2) Turn " + getOnOff(!BiGpairSEQ.cachePlate()) + " plate file caching");
System.out.println("3) Turn " + getOnOff(!BiGpairSEQ.cacheGraph()) + " graph/data file caching");
System.out.println("4) Maximum weight matching algorithm options");
System.out.println("0) Return to main menu");
try {
input = sc.nextInt();
switch (input) {
case 1 -> BiGpairSEQ.setCacheCells(!BiGpairSEQ.cacheCells());
case 2 -> BiGpairSEQ.setCachePlate(!BiGpairSEQ.cachePlate());
case 3 -> BiGpairSEQ.setCacheGraph(!BiGpairSEQ.cacheGraph());
case 4 -> algorithmOptions();
case 0 -> backToMain = true;
default -> System.out.println("Invalid input");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
}
}
/**
* Helper function for printing menu items in mainOptions(). Returns a string based on the value of the parameter.
*
* @param b - a boolean value
* @return String "on" if b is true, "off" if b is false
*/
private static String getOnOff(boolean b) {
if (b) { return "on";}
else { return "off"; }
}
private static void algorithmOptions(){
boolean backToOptions = false;
while(!backToOptions) {
System.out.println("\n---------ALGORITHM OPTIONS----------");
System.out.println("1) Use scaling algorithm by Duan and Su.");
System.out.println("2) Use LEDA book algorithm with Fibonacci heap priority queue");
System.out.println("3) Use LEDA book algorithm with pairing heap priority queue");
System.out.println("0) Return to Options menu");
try {
input = sc.nextInt();
switch (input) {
case 1 -> System.out.println("This option is not yet implemented. Choose another.");
case 2 -> {
BiGpairSEQ.setFibonacciHeap();
System.out.println("MWM algorithm set to LEDA with Fibonacci heap");
backToOptions = true;
}
case 3 -> {
BiGpairSEQ.setPairingHeap();
System.out.println("MWM algorithm set to LEDA with pairing heap");
backToOptions = true;
}
case 0 -> backToOptions = true;
default -> System.out.println("Invalid input");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
}
}
private static void acknowledge(){
System.out.println("This program simulates BiGpairSEQ, a graph theory based adaptation");
System.out.println("of the pairSEQ algorithm for pairing T cell receptor sequences.");
System.out.println();
System.out.println("For full documentation, view readme.md file distributed with this code");
System.out.println("or visit https://gitea.ejsf.synology.me/efischer/BiGpairSEQ.");
System.out.println();
System.out.println("pairSEQ citation:");
System.out.println("Howie, B., Sherwood, A. M., et. al.");
System.out.println("High-throughput pairing of T cell receptor alpha and beta sequences.");
System.out.println("Sci. Transl. Med. 7, 301ra131 (2015)");
System.out.println();
System.out.println("BiGpairSEQ_Sim by Eugene Fischer, 2021-2022");
}
}


@@ -0,0 +1,3 @@
Manifest-Version: 1.0
Main-Class: BiGpairSEQ


@@ -7,13 +7,10 @@ import java.nio.file.Files;
import java.nio.file.Path; import java.nio.file.Path;
import java.nio.file.StandardOpenOption; import java.nio.file.StandardOpenOption;
import java.util.List; import java.util.List;
import java.util.regex.Pattern;
public class MatchingFileWriter { public class MatchingFileWriter {
private String filename; private String filename;
private String sourceFileName;
private List<String> comments; private List<String> comments;
private List<String> headers; private List<String> headers;
private List<List<String>> allResults; private List<List<String>> allResults;
@@ -23,7 +20,6 @@ public class MatchingFileWriter {
filename = filename + ".csv"; filename = filename + ".csv";
} }
this.filename = filename; this.filename = filename;
this.sourceFileName = result.getSourceFileName();
this.comments = result.getComments(); this.comments = result.getComments();
this.headers = result.getHeaders(); this.headers = result.getHeaders();
this.allResults = result.getAllResults(); this.allResults = result.getAllResults();


@@ -1,18 +1,41 @@
import java.time.Duration; import java.time.Duration;
import java.util.ArrayList;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
public class MatchingResult { public class MatchingResult {
private String sourceFile;
private List<String> comments;
private List<String> headers;
private List<List<String>> allResults;
private Map<Integer, Integer> matchMap;
private Duration time;
public MatchingResult(String sourceFileName, List<String> comments, List<String> headers, List<List<String>> allResults, Map<Integer, Integer>matchMap, Duration time){ private final Map<String, String> metadata;
this.sourceFile = sourceFileName; private final List<String> comments;
this.comments = comments; private final List<String> headers;
private final List<List<String>> allResults;
private final Map<Integer, Integer> matchMap;
private final Duration time;
public MatchingResult(Map<String, String> metadata, List<String> headers,
List<List<String>> allResults, Map<Integer, Integer>matchMap, Duration time){
/*
* POSSIBLE KEYS FOR METADATA MAP ARE:
* sample plate filename *
* graph filename *
* well populations *
* total alphas found *
* total betas found *
* high overlap threshold
* low overlap threshold
* maximum occupancy difference
* minimum overlap percent
* pairing attempt rate
* correct pairing count
* incorrect pairing count
* pairing error rate
* simulation time
*/
this.metadata = metadata;
this.comments = new ArrayList<>();
for (String key : metadata.keySet()) {
comments.add(key +": " + metadata.get(key));
}
this.headers = headers; this.headers = headers;
this.allResults = allResults; this.allResults = allResults;
this.matchMap = matchMap; this.matchMap = matchMap;
@@ -20,6 +43,8 @@ public class MatchingResult {
} }
public Map<String, String> getMetadata() {return metadata;}
public List<String> getComments() { public List<String> getComments() {
return comments; return comments;
} }
@@ -40,7 +65,32 @@ public class MatchingResult {
return time; return time;
} }
public String getSourceFileName() { public String getPlateFilename() {
return sourceFile; return metadata.get("sample plate filename");
} }
public String getGraphFilename() {
return metadata.get("graph filename");
}
public Integer[] getWellPopulations() {
List<Integer> wellPopulations = new ArrayList<>();
String popString = metadata.get("well populations");
for (String p : popString.split(", ")) {
wellPopulations.add(Integer.parseInt(p));
}
Integer[] popArray = new Integer[wellPopulations.size()];
return wellPopulations.toArray(popArray);
}
public Integer getAlphaCount() {
return Integer.parseInt(metadata.get("total alpha count"));
}
public Integer getBetaCount() {
return Integer.parseInt(metadata.get("total beta count"));
}
//TODO: add accessors for the remaining metadata keys, following the same pattern
} }
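The constructor above replaces MatchingResult's single source-file field with a general metadata map; the keys it expects are the ones listed in the constructor comment and written by Simulator.matchCDR3s. Below is a minimal sketch (not part of the diff) of how a caller might build and read back that metadata, assuming the constructor and accessors shown above; the filenames, counts, headers, and match map are placeholder values for illustration only.

import java.time.Duration;
import java.util.*;

public class MatchingResultMetadataExample {
    public static void main(String[] args) {
        //keys follow the list in the MatchingResult constructor comment
        Map<String, String> metadata = new LinkedHashMap<>();
        metadata.put("sample plate filename", "plate_example.csv");  //placeholder filename
        metadata.put("graph filename", "graph_example.dat");         //placeholder filename
        metadata.put("well populations", "100, 200, 400");
        metadata.put("total alphas found", "1234");
        metadata.put("total betas found", "1230");

        //placeholder headers, result rows, and match map
        List<String> headers = List.of("alpha", "beta", "overlap", "correct?");
        List<List<String>> allResults = List.of(List.of("7", "12", "19", "true"));
        Map<Integer, Integer> matchMap = Map.of(7, 12);

        MatchingResult result = new MatchingResult(metadata, headers, allResults, matchMap, Duration.ofSeconds(42));

        //comments are generated from the metadata map, one "key: value" line per entry
        result.getComments().forEach(System.out::println);
        System.out.println("Plate file: " + result.getPlateFilename());
        System.out.println("Well populations: " + Arrays.toString(result.getWellPopulations()));
        System.out.println("Total alphas found: " + result.getAlphaCount());
    }
}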


@@ -1,26 +1,28 @@
import java.util.*;
/* /*
TODO: Implement exponential distribution using inversion method - DONE TODO: Implement exponential distribution using inversion method - DONE
TODO: Implement discrete frequency distributions using Vose's Alias Method TODO: Implement discrete frequency distributions using Vose's Alias Method
*/ */
import java.util.*;
public class Plate { public class Plate {
private String sourceFile; private String sourceFile;
private List<List<Integer[]>> wells; private List<List<Integer[]>> wells;
private Random rand = new Random(); private final Random rand = BiGpairSEQ.getRand();
private int size; private int size;
private double error; private double error;
private Integer[] concentrations; private Integer[] populations;
private double stdDev; private double stdDev;
private double lambda; private double lambda;
boolean exponential = false; boolean exponential = false;
public Plate(int size, double error, Integer[] concentrations) { public Plate(int size, double error, Integer[] populations) {
this.size = size; this.size = size;
this.error = error; this.error = error;
this.concentrations = concentrations; this.populations = populations;
wells = new ArrayList<>(); wells = new ArrayList<>();
} }
@@ -35,9 +37,9 @@ public class Plate {
concentrations.add(w.size()); concentrations.add(w.size());
} }
} }
this.concentrations = new Integer[concentrations.size()]; this.populations = new Integer[concentrations.size()];
for (int i = 0; i < this.concentrations.length; i++) { for (int i = 0; i < this.populations.length; i++) {
this.concentrations[i] = concentrations.get(i); this.populations[i] = concentrations.get(i);
} }
} }
@@ -45,27 +47,19 @@ public class Plate {
this.lambda = lambda; this.lambda = lambda;
exponential = true; exponential = true;
sourceFile = sourceFileName; sourceFile = sourceFileName;
int numSections = concentrations.length; int numSections = populations.length;
int section = 0; int section = 0;
double m; double m;
int n; int n;
int test=0;
while (section < numSections){ while (section < numSections){
for (int i = 0; i < (size / numSections); i++) { for (int i = 0; i < (size / numSections); i++) {
List<Integer[]> well = new ArrayList<>(); List<Integer[]> well = new ArrayList<>();
for (int j = 0; j < concentrations[section]; j++) { for (int j = 0; j < populations[section]; j++) {
do { do {
//inverse transform sampling: for random number u in [0,1), x = log(1-u) / (-lambda) //inverse transform sampling: for random number u in [0,1), x = log(1-u) / (-lambda)
m = (Math.log10((1 - rand.nextDouble()))/(-lambda)) * Math.sqrt(cells.size()); m = (Math.log10((1 - rand.nextDouble()))/(-lambda)) * Math.sqrt(cells.size());
} while (m >= cells.size() || m < 0); } while (m >= cells.size() || m < 0);
n = (int) Math.floor(m); n = (int) Math.floor(m);
//n = Equations.getRandomNumber(0, cells.size());
// was testing generating the cell sample file with exponential dist, then sampling flat here
//that would be more realistic
//But would mess up other things in the simulation with how I've coded it.
if(n > test){
test = n;
}
Integer[] cellToAdd = cells.get(n).clone(); Integer[] cellToAdd = cells.get(n).clone();
for(int k = 0; k < cellToAdd.length; k++){ for(int k = 0; k < cellToAdd.length; k++){
if(Math.abs(rand.nextDouble()) < error){//error applied to each sequence if(Math.abs(rand.nextDouble()) < error){//error applied to each sequence
@@ -78,20 +72,19 @@ public class Plate {
} }
section++; section++;
} }
System.out.println("Highest index: " +test);
} }
public void fillWells(String sourceFileName, List<Integer[]> cells, double stdDev) { public void fillWells(String sourceFileName, List<Integer[]> cells, double stdDev) {
this.stdDev = stdDev; this.stdDev = stdDev;
sourceFile = sourceFileName; sourceFile = sourceFileName;
int numSections = concentrations.length; int numSections = populations.length;
int section = 0; int section = 0;
double m; double m;
int n; int n;
while (section < numSections){ while (section < numSections){
for (int i = 0; i < (size / numSections); i++) { for (int i = 0; i < (size / numSections); i++) {
List<Integer[]> well = new ArrayList<>(); List<Integer[]> well = new ArrayList<>();
for (int j = 0; j < concentrations[section]; j++) { for (int j = 0; j < populations[section]; j++) {
do { do {
m = (rand.nextGaussian() * stdDev) + (cells.size() / 2); m = (rand.nextGaussian() * stdDev) + (cells.size() / 2);
} while (m >= cells.size() || m < 0); } while (m >= cells.size() || m < 0);
@@ -110,8 +103,8 @@ public class Plate {
} }
} }
public Integer[] getConcentrations(){ public Integer[] getPopulations(){
return concentrations; return populations;
} }
public int getSize(){ public int getSize(){
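The fillWells loops above draw each cell index by inverse transform sampling of an exponential distribution (the in-line comment gives x = log(1-u) / (-lambda)), then scale the draw and reject values outside the valid index range. The following is a standalone sketch (not part of the diff) of that sampling step in its textbook form with the natural logarithm (the Plate code uses Math.log10, which is equivalent up to a constant factor on lambda); the lambda and population size below are arbitrary illustration values.

import java.util.Random;

public class ExponentialIndexSampler {
    public static void main(String[] args) {
        Random rand = new Random();
        double lambda = 0.5;       //illustration value
        int populationSize = 1000; //illustration value, stands in for cells.size()

        for (int draw = 0; draw < 5; draw++) {
            double m;
            do {
                //inverse transform sampling: for u uniform in [0,1), x = -ln(1 - u) / lambda
                //is exponentially distributed with rate lambda; scale it toward the index range
                m = (-Math.log(1.0 - rand.nextDouble()) / lambda) * Math.sqrt(populationSize);
            } while (m >= populationSize || m < 0); //reject draws outside the valid index range
            int index = (int) Math.floor(m);
            System.out.println("sampled index: " + index);
        }
    }
}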


@@ -7,7 +7,6 @@ import java.nio.file.Files;
import java.nio.file.Path; import java.nio.file.Path;
import java.nio.file.StandardOpenOption; import java.nio.file.StandardOpenOption;
import java.util.*; import java.util.*;
import java.util.regex.Pattern;
public class PlateFileWriter { public class PlateFileWriter {
private int size; private int size;
@@ -17,8 +16,7 @@ public class PlateFileWriter {
private Double error; private Double error;
private String filename; private String filename;
private String sourceFileName; private String sourceFileName;
private String[] headers; private Integer[] populations;
private List<Integer> concentrations;
private boolean isExponential = false; private boolean isExponential = false;
public PlateFileWriter(String filename, Plate plate) { public PlateFileWriter(String filename, Plate plate) {
@@ -37,8 +35,8 @@ public class PlateFileWriter {
} }
this.error = plate.getError(); this.error = plate.getError();
this.wells = plate.getWells(); this.wells = plate.getWells();
this.concentrations = Arrays.asList(plate.getConcentrations()); this.populations = plate.getPopulations();
concentrations.sort(Comparator.reverseOrder()); Arrays.sort(populations);
} }
public void writePlateFile(){ public void writePlateFile(){
@@ -59,28 +57,32 @@ public class PlateFileWriter {
} }
} }
//this took forever // //this took forever and I don't use it
List<List<String>> rows = new ArrayList<>(); // //if I wanted to use it, I'd replace printer.printRecords(wellsAsStrings) with printer.printRecords(rows)
List<String> tmp = new ArrayList<>(); // List<List<String>> rows = new ArrayList<>();
for(int i = 0; i < wellsAsStrings.size(); i++){//List<Integer[]> w: wells){ // List<String> tmp = new ArrayList<>();
tmp.add("well " + (i+1)); // for(int i = 0; i < wellsAsStrings.size(); i++){//List<Integer[]> w: wells){
} // tmp.add("well " + (i+1));
rows.add(tmp); // }
for(int row = 0; row < maxLength; row++){ // rows.add(tmp);
tmp = new ArrayList<>(); // for(int row = 0; row < maxLength; row++){
for(List<String> c: wellsAsStrings){ // tmp = new ArrayList<>();
tmp.add(c.get(row)); // for(List<String> c: wellsAsStrings){
} // tmp.add(c.get(row));
rows.add(tmp); // }
} // rows.add(tmp);
//build string of well concentrations // }
StringBuilder concen = new StringBuilder();
for(Integer i: concentrations){
concen.append(i.toString());
concen.append(" ");
}
String concenString = concen.toString();
//make string out of populations array
StringBuilder populationsStringBuilder = new StringBuilder();
populationsStringBuilder.append(populations[0].toString());
for(int i = 1; i < populations.length; i++){
populationsStringBuilder.append(", ");
populationsStringBuilder.append(populations[i].toString());
}
String wellPopulationsString = populationsStringBuilder.toString();
//set CSV format
CSVFormat plateFileFormat = CSVFormat.Builder.create() CSVFormat plateFileFormat = CSVFormat.Builder.create()
.setCommentMarker('#') .setCommentMarker('#')
.build(); .build();
@@ -92,7 +94,7 @@ public class PlateFileWriter {
printer.printComment("Each row represents one well on the plate."); printer.printComment("Each row represents one well on the plate.");
printer.printComment("Plate size: " + size); printer.printComment("Plate size: " + size);
printer.printComment("Error rate: " + error); printer.printComment("Error rate: " + error);
printer.printComment("Concentrations: " + concenString); printer.printComment("Well populations: " + wellPopulationsString);
if(isExponential){ if(isExponential){
printer.printComment("Lambda: " + lambda); printer.printComment("Lambda: " + lambda);
} }
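The well-populations comment string assembled with a StringBuilder above (and again in the Simulator diff below) can equivalently be produced with a stream join; a small sketch (not part of the diff), assuming only a non-empty Integer[] of populations:

import java.util.Arrays;
import java.util.stream.Collectors;

public class PopulationsStringExample {
    public static void main(String[] args) {
        Integer[] populations = {100, 200, 400}; //illustration values
        String wellPopulationsString = Arrays.stream(populations)
                .map(String::valueOf)
                .collect(Collectors.joining(", "));
        System.out.println(wellPopulationsString); //prints "100, 200, 400"
    }
}

Either form yields the same comma-separated list that MatchingResult.getWellPopulations() later splits on ", ".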


@@ -3,6 +3,7 @@ import org.jgrapht.alg.matching.MaximumWeightBipartiteMatching;
import org.jgrapht.generate.SimpleWeightedBipartiteGraphMatrixGenerator; import org.jgrapht.generate.SimpleWeightedBipartiteGraphMatrixGenerator;
import org.jgrapht.graph.DefaultWeightedEdge; import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph; import org.jgrapht.graph.SimpleWeightedGraph;
import org.jheaps.tree.FibonacciHeap;
import org.jheaps.tree.PairingHeap; import org.jheaps.tree.PairingHeap;
import java.math.BigDecimal; import java.math.BigDecimal;
@@ -13,8 +14,10 @@ import java.time.Duration;
import java.util.*; import java.util.*;
import java.util.stream.IntStream; import java.util.stream.IntStream;
import static java.lang.Float.*;
//NOTE: "sequence" in method and variable names refers to a peptide sequence from a simulated T cell //NOTE: "sequence" in method and variable names refers to a peptide sequence from a simulated T cell
public class Simulator { public class Simulator implements GraphModificationFunctions {
private static final int cdr3AlphaIndex = 0; private static final int cdr3AlphaIndex = 0;
private static final int cdr3BetaIndex = 1; private static final int cdr3BetaIndex = 1;
private static final int cdr1AlphaIndex = 2; private static final int cdr1AlphaIndex = 2;
@@ -49,6 +52,7 @@ public class Simulator {
Instant start = Instant.now(); Instant start = Instant.now();
int[] alphaIndex = {cdr3AlphaIndex}; int[] alphaIndex = {cdr3AlphaIndex};
int[] betaIndex = {cdr3BetaIndex}; int[] betaIndex = {cdr3BetaIndex};
int numWells = samplePlate.getSize(); int numWells = samplePlate.getSize();
if(verbose){System.out.println("Making cell maps");} if(verbose){System.out.println("Making cell maps");}
@@ -63,15 +67,11 @@ public class Simulator {
if(verbose){System.out.println("All alphas count: " + alphaCount);} if(verbose){System.out.println("All alphas count: " + alphaCount);}
int betaCount = allBetas.size(); int betaCount = allBetas.size();
if(verbose){System.out.println("All betas count: " + betaCount);} if(verbose){System.out.println("All betas count: " + betaCount);}
if(verbose){System.out.println("Well maps made");} if(verbose){System.out.println("Well maps made");}
//Remove saturating-occupancy sequences because they have no signal value.
//Remove sequences with total occupancy below minimum pair overlap threshold
if(verbose){System.out.println("Removing sequences present in all wells.");} if(verbose){System.out.println("Removing sequences present in all wells.");}
//if(verbose){System.out.println("Removing sequences with occupancy below the minimum overlap threshold");} filterByOccupancyThresholds(allAlphas, 1, numWells - 1);
filterByOccupancyThreshold(allAlphas, 1, numWells - 1); filterByOccupancyThresholds(allBetas, 1, numWells - 1);
filterByOccupancyThreshold(allBetas, 1, numWells - 1);
if(verbose){System.out.println("Sequences removed");} if(verbose){System.out.println("Sequences removed");}
int pairableAlphaCount = allAlphas.size(); int pairableAlphaCount = allAlphas.size();
if(verbose){System.out.println("Remaining alphas count: " + pairableAlphaCount);} if(verbose){System.out.println("Remaining alphas count: " + pairableAlphaCount);}
@@ -131,10 +131,15 @@ public class Simulator {
Instant stop = Instant.now(); Instant stop = Instant.now();
Duration time = Duration.between(start, stop); Duration time = Duration.between(start, stop);
//return GraphWithMapData object
return new GraphWithMapData(graph, numWells, samplePlate.getConcentrations(), alphaCount, betaCount, //create GraphWithMapData object
GraphWithMapData output = new GraphWithMapData(graph, numWells, samplePlate.getPopulations(), alphaCount, betaCount,
distCellsMapAlphaKey, plateVtoAMap, plateVtoBMap, plateAtoVMap, distCellsMapAlphaKey, plateVtoAMap, plateVtoBMap, plateAtoVMap,
plateBtoVMap, alphaWellCounts, betaWellCounts, time); plateBtoVMap, alphaWellCounts, betaWellCounts, time);
//Set source file name in graph to name of sample plate
output.setSourceFilename(samplePlate.getSourceFileName());
//return GraphWithMapData object
return output;
} }
//match CDR3s. //match CDR3s.
@@ -142,6 +147,8 @@ public class Simulator {
Integer highThreshold, Integer maxOccupancyDifference, Integer highThreshold, Integer maxOccupancyDifference,
Integer minOverlapPercent, boolean verbose) { Integer minOverlapPercent, boolean verbose) {
Instant start = Instant.now(); Instant start = Instant.now();
List<Integer[]> removedEdges = new ArrayList<>();
boolean saveEdges = BiGpairSEQ.cacheGraph();
int numWells = data.getNumWells(); int numWells = data.getNumWells();
Integer alphaCount = data.getAlphaCount(); Integer alphaCount = data.getAlphaCount();
Integer betaCount = data.getBetaCount(); Integer betaCount = data.getBetaCount();
@@ -152,33 +159,52 @@ public class Simulator {
Map<Integer, Integer> betaWellCounts = data.getBetaWellCounts(); Map<Integer, Integer> betaWellCounts = data.getBetaWellCounts();
SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph = data.getGraph(); SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph = data.getGraph();
//remove weights outside given overlap thresholds //remove edges with weights outside given overlap thresholds, add those to removed edge list
if(verbose){System.out.println("Eliminating edges with weights outside overlap threshold values");} if(verbose){System.out.println("Eliminating edges with weights outside overlap threshold values");}
filterByOccupancyThreshold(graph, lowThreshold, highThreshold); removedEdges.addAll(GraphModificationFunctions.filterByOverlapThresholds(graph, lowThreshold, highThreshold, saveEdges));
if(verbose){System.out.println("Over- and under-weight edges set to 0.0");} if(verbose){System.out.println("Over- and under-weight edges removed");}
//Filter by overlap size //remove edges between vertices with too small an overlap size, add those to removed edge list
if(verbose){System.out.println("Eliminating edges with weights less than " + minOverlapPercent.toString() + if(verbose){System.out.println("Eliminating edges with weights less than " + minOverlapPercent.toString() +
" percent of vertex occupancy value.");} " percent of vertex occupancy value.");}
filterByOverlapSize(graph, alphaWellCounts, betaWellCounts, plateVtoAMap, plateVtoBMap, minOverlapPercent); removedEdges.addAll(GraphModificationFunctions.filterByOverlapPercent(graph, alphaWellCounts, betaWellCounts,
if(verbose){System.out.println("Edges with weights too far below vertex occupancy values set to 0.0");} plateVtoAMap, plateVtoBMap, minOverlapPercent, saveEdges));
if(verbose){System.out.println("Edges with weights too far below a vertex occupancy value removed");}
//Filter by relative occupancy //Filter by relative occupancy
if(verbose){System.out.println("Eliminating edges between vertices with occupancy difference > " if(verbose){System.out.println("Eliminating edges between vertices with occupancy difference > "
+ maxOccupancyDifference);} + maxOccupancyDifference);}
filterByRelativeOccupancy(graph, alphaWellCounts, betaWellCounts, plateVtoAMap, plateVtoBMap, removedEdges.addAll(GraphModificationFunctions.filterByRelativeOccupancy(graph, alphaWellCounts, betaWellCounts,
maxOccupancyDifference); plateVtoAMap, plateVtoBMap, maxOccupancyDifference, saveEdges));
if(verbose){System.out.println("Edges between vertices of with excessively different occupancy values " + if(verbose){System.out.println("Edges between vertices of with excessively different occupancy values " +
"set to 0.0");} "removed");}
//Find Maximum Weighted Matching //Find Maximum Weighted Matching
//using jheaps library class PairingHeap for improved efficiency //using jheaps library class PairingHeap for improved efficiency
if(verbose){System.out.println("Finding maximum weighted matching");} if(verbose){System.out.println("Finding maximum weighted matching");}
//Attempting to use addressable heap to improve performance MaximumWeightBipartiteMatching maxWeightMatching;
MaximumWeightBipartiteMatching maxWeightMatching = //Use correct heap type for priority queue
new MaximumWeightBipartiteMatching(graph, String heapType = BiGpairSEQ.getPriorityQueueHeapType();
switch (heapType) {
case "PAIRING" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(), plateVtoAMap.keySet(),
plateVtoBMap.keySet(), plateVtoBMap.keySet(),
i -> new PairingHeap(Comparator.naturalOrder())); i -> new PairingHeap(Comparator.naturalOrder()));
}
case "FIBONACCI" -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(),
plateVtoBMap.keySet(),
i -> new FibonacciHeap(Comparator.naturalOrder()));
}
default -> {
maxWeightMatching = new MaximumWeightBipartiteMatching(graph,
plateVtoAMap.keySet(),
plateVtoBMap.keySet());
}
}
//get the matching
MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = maxWeightMatching.getMatching(); MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = maxWeightMatching.getMatching();
if(verbose){System.out.println("Matching completed");} if(verbose){System.out.println("Matching completed");}
Instant stop = Instant.now(); Instant stop = Instant.now();
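The hunk above changes the pre-matching filters so that edges outside the thresholds are removed from the graph, and collected so they can be restored when the graph stays cached, rather than having their weights set to 0.0. The GraphModificationFunctions implementations are not shown in this excerpt, so the following is only a sketch (not part of the diff) of that remove-and-restore pattern on a JGraphT SimpleWeightedGraph, using the same Integer[] {source, target, weight} encoding as the removedEdges list; the method bodies and threshold handling here are assumptions for illustration.

import org.jgrapht.graph.DefaultWeightedEdge;
import org.jgrapht.graph.SimpleWeightedGraph;

import java.util.ArrayList;
import java.util.List;

public class EdgeFilterSketch {
    //remove edges whose weight falls outside [low, high]; if saveEdges, return them as {source, target, weight}
    static List<Integer[]> filterByOverlapThresholds(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
                                                     double low, double high, boolean saveEdges) {
        List<Integer[]> removed = new ArrayList<>();
        List<DefaultWeightedEdge> toRemove = new ArrayList<>();
        for (DefaultWeightedEdge e : graph.edgeSet()) {
            double w = graph.getEdgeWeight(e);
            if (w < low || w > high) {
                toRemove.add(e);
                if (saveEdges) {
                    removed.add(new Integer[]{graph.getEdgeSource(e), graph.getEdgeTarget(e), (int) w});
                }
            }
        }
        toRemove.forEach(graph::removeEdge); //remove after iterating to avoid modifying the edge set in place
        return removed;
    }

    //put previously removed edges (and their weights) back so the cached graph can be reused
    static void addRemovedEdges(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph, List<Integer[]> removedEdges) {
        for (Integer[] edge : removedEdges) {
            DefaultWeightedEdge e = graph.addEdge(edge[0], edge[1]);
            graph.setEdgeWeight(e, edge[2]);
        }
    }
}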
@@ -233,351 +259,383 @@ public class Simulator {
allResults.add(result); allResults.add(result);
} }
//Metadate comments for CSV file //Metadata comments for CSV file
int min = Math.min(alphaCount, betaCount); int min = Math.min(alphaCount, betaCount);
//rate of attempted matching
double attemptRate = (double) (trueCount + falseCount) / min; double attemptRate = (double) (trueCount + falseCount) / min;
BigDecimal attemptRateTrunc = new BigDecimal(attemptRate, mc); BigDecimal attemptRateTrunc = new BigDecimal(attemptRate, mc);
//rate of pairing error
double pairingErrorRate = (double) falseCount / (trueCount + falseCount); double pairingErrorRate = (double) falseCount / (trueCount + falseCount);
BigDecimal pairingErrorRateTrunc = new BigDecimal(pairingErrorRate, mc); BigDecimal pairingErrorRateTrunc;
//get list of well concentrations if(Double.isNaN(pairingErrorRate) || Double.isInfinite(pairingErrorRate)) {
List<Integer> wellConcentrations = Arrays.asList(data.getWellConcentrations()); pairingErrorRateTrunc = new BigDecimal(-1, mc);
//make string out of concentrations list
StringBuilder concentrationStringBuilder = new StringBuilder();
for(Integer i: wellConcentrations){
concentrationStringBuilder.append(i.toString());
concentrationStringBuilder.append(" ");
} }
String concentrationString = concentrationStringBuilder.toString(); else{
pairingErrorRateTrunc = new BigDecimal(pairingErrorRate, mc);
List<String> comments = new ArrayList<>(); }
comments.add("Source Sample Plate filename: " + data.getSourceFilename()); //get list of well populations
comments.add("Source Graph and Data filename: " + dataFilename); Integer[] wellPopulations = data.getWellPopulations();
comments.add("T cell counts in sample plate wells: " + concentrationString); //make string out of populations list
comments.add("Total alphas found: " + alphaCount); StringBuilder populationsStringBuilder = new StringBuilder();
comments.add("Total betas found: " + betaCount); populationsStringBuilder.append(wellPopulations[0].toString());
comments.add("High overlap threshold: " + highThreshold); for(int i = 1; i < wellPopulations.length; i++){
comments.add("Low overlap threshold: " + lowThreshold); populationsStringBuilder.append(", ");
comments.add("Minimum overlap percent: " + minOverlapPercent); populationsStringBuilder.append(wellPopulations[i].toString());
comments.add("Maximum occupancy difference: " + maxOccupancyDifference); }
comments.add("Pairing attempt rate: " + attemptRateTrunc); String wellPopulationsString = populationsStringBuilder.toString();
comments.add("Correct pairings: " + trueCount); //total simulation time
comments.add("Incorrect pairings: " + falseCount);
comments.add("Pairing error rate: " + pairingErrorRateTrunc);
Duration time = Duration.between(start, stop); Duration time = Duration.between(start, stop);
time = time.plus(data.getTime()); time = time.plus(data.getTime());
comments.add("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
Map<String, String> metadata = new LinkedHashMap<>();
metadata.put("sample plate filename", data.getSourceFilename());
metadata.put("graph filename", dataFilename);
metadata.put("well populations", wellPopulationsString);
metadata.put("total alphas found", alphaCount.toString());
metadata.put("total betas found", betaCount.toString());
metadata.put("high overlap threshold", highThreshold.toString());
metadata.put("low overlap threshold", lowThreshold.toString());
metadata.put("minimum overlap percent", minOverlapPercent.toString());
metadata.put("maximum occupancy difference", maxOccupancyDifference.toString());
metadata.put("pairing attempt rate", attemptRateTrunc.toString());
metadata.put("correct pairing count", Integer.toString(trueCount));
metadata.put("incorrect pairing count", Integer.toString(falseCount));
metadata.put("pairing error rate", pairingErrorRateTrunc.toString());
metadata.put("simulation time", nf.format(time.toSeconds()));
//create MatchingResult object
MatchingResult output = new MatchingResult(metadata, header, allResults, matchMap, time);
if(verbose){ if(verbose){
for(String s: comments){ for(String s: output.getComments()){
System.out.println(s); System.out.println(s);
} }
} }
return new MatchingResult(data.getSourceFilename(), comments, header, allResults, matchMap, time);
}
//Simulated matching of CDR1s to CDR3s. Requires MatchingResult from prior run of matchCDR3s. if(saveEdges) {
public static MatchingResult[] matchCDR1s(List<Integer[]> distinctCells, //put the removed edges back on the graph
Plate samplePlate, Integer lowThreshold, System.out.println("Restoring removed edges to graph.");
Integer highThreshold, MatchingResult priorResult){ GraphModificationFunctions.addRemovedEdges(graph, removedEdges);
Instant start = Instant.now();
Duration previousTime = priorResult.getTime();
Map<Integer, Integer> previousMatches = priorResult.getMatchMap();
int numWells = samplePlate.getSize();
int[] cdr3Indices = {cdr3AlphaIndex, cdr3BetaIndex};
int[] cdr1Indices = {cdr1AlphaIndex, cdr1BetaIndex};
System.out.println("Making previous match maps");
Map<Integer, Integer> cdr3AtoBMap = previousMatches;
Map<Integer, Integer> cdr3BtoAMap = invertVertexMap(cdr3AtoBMap);
System.out.println("Previous match maps made");
System.out.println("Making cell maps");
Map<Integer, Integer> alphaCDR3toCDR1Map = makeSequenceToSequenceMap(distinctCells, cdr3AlphaIndex, cdr1AlphaIndex);
Map<Integer, Integer> betaCDR3toCDR1Map = makeSequenceToSequenceMap(distinctCells, cdr3BetaIndex, cdr1BetaIndex);
System.out.println("Cell maps made");
System.out.println("Making well maps");
Map<Integer, Integer> allCDR3s = samplePlate.assayWellsSequenceS(cdr3Indices);
Map<Integer, Integer> allCDR1s = samplePlate.assayWellsSequenceS(cdr1Indices);
int CDR3Count = allCDR3s.size();
System.out.println("all CDR3s count: " + CDR3Count);
int CDR1Count = allCDR1s.size();
System.out.println("all CDR1s count: " + CDR1Count);
System.out.println("Well maps made");
System.out.println("Removing unpaired CDR3s from well maps");
List<Integer> unpairedCDR3s = new ArrayList<>();
for(Integer i: allCDR3s.keySet()){
if(!(cdr3AtoBMap.containsKey(i) || cdr3BtoAMap.containsKey(i))){
unpairedCDR3s.add(i);
}
} }
for(Integer i: unpairedCDR3s){ //return MatchingResult object
allCDR3s.remove(i);
}
System.out.println("Unpaired CDR3s removed.");
System.out.println("Remaining CDR3 count: " + allCDR3s.size());
System.out.println("Removing below-minimum-overlap-threshold and saturating-occupancy CDR1s");
filterByOccupancyThreshold(allCDR1s, lowThreshold, numWells - 1);
System.out.println("CDR1s removed.");
System.out.println("Remaining CDR1 count: " + allCDR1s.size());
System.out.println("Making vertex maps");
//For the SimpleWeightedBipartiteGraphMatrixGenerator, all vertices must have
// distinct numbers associated with them. Since I'm using a 2D array, that means
// distinct indices between the rows and columns. vertexStartValue lets me track where I switch
// from numbering rows to columns, so I can assign unique numbers to every vertex, and then
// subtract the vertexStartValue from CDR1s to use their vertex labels as array indices
Integer vertexStartValue = 0;
//keys are sequential integer vertices, values are CDR3s
Map<Integer, Integer> plateVtoCDR3Map = makeVertexToSequenceMap(allCDR3s, vertexStartValue);
//New start value for vertex to CDR1 map should be one more than final vertex value in CDR3 map
vertexStartValue += plateVtoCDR3Map.size();
//keys are sequential integers vertices, values are CDR1s
Map<Integer, Integer> plateVtoCDR1Map = makeVertexToSequenceMap(allCDR1s, vertexStartValue);
//keys are CDR3s, values are sequential integer vertices from previous map
Map<Integer, Integer> plateCDR3toVMap = invertVertexMap(plateVtoCDR3Map);
//keys are CDR1s, values are sequential integer vertices from previous map
Map<Integer, Integer> plateCDR1toVMap = invertVertexMap(plateVtoCDR1Map);
System.out.println("Vertex maps made");
System.out.println("Creating adjacency matrix");
//Count how many wells each CDR3 appears in
Map<Integer, Integer> cdr3WellCounts = new HashMap<>();
//count how many wells each CDR1 appears in
Map<Integer, Integer> cdr1WellCounts = new HashMap<>();
//add edges, where weights are number of wells the peptides share in common.
//If this is too slow, can make a 2d array and use the SimpleWeightedGraphMatrixGenerator class
Map<Integer, Integer> wellNCDR3s = null;
Map<Integer, Integer> wellNCDR1s = null;
double[][] weights = new double[plateVtoCDR3Map.size()][plateVtoCDR1Map.size()];
countSequencesAndFillMatrix(samplePlate, allCDR3s, allCDR1s, plateCDR3toVMap, plateCDR1toVMap,
cdr3Indices, cdr1Indices, cdr3WellCounts, cdr1WellCounts, weights);
System.out.println("Matrix created");
System.out.println("Creating graph");
SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph =
new SimpleWeightedGraph<>(DefaultWeightedEdge.class);
SimpleWeightedBipartiteGraphMatrixGenerator graphGenerator = new SimpleWeightedBipartiteGraphMatrixGenerator();
List<Integer> cdr3Vertices = new ArrayList<>(plateVtoCDR3Map.keySet()); //This will work because LinkedHashMap preserves order of entry
graphGenerator.first(cdr3Vertices);
List<Integer> cdr1Vertices = new ArrayList<>(plateVtoCDR1Map.keySet());
graphGenerator.second(cdr1Vertices); //This will work because LinkedHashMap preserves order of entry
graphGenerator.weights(weights);
graphGenerator.generateGraph(graph);
System.out.println("Graph created");
System.out.println("Removing edges outside of weight thresholds");
filterByOccupancyThreshold(graph, lowThreshold, highThreshold);
System.out.println("Over- and under-weight edges set to 0.0");
System.out.println("Finding first maximum weighted matching");
MaximumWeightBipartiteMatching firstMaxWeightMatching =
new MaximumWeightBipartiteMatching(graph, plateVtoCDR3Map.keySet(), plateVtoCDR1Map.keySet());
MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = firstMaxWeightMatching.getMatching();
System.out.println("First maximum weighted matching found");
//first processing run
Map<Integer, Integer> firstMatchCDR3toCDR1Map = new HashMap<>();
Iterator<DefaultWeightedEdge> weightIter = graphMatching.iterator();
DefaultWeightedEdge e;
while(weightIter.hasNext()){
e = weightIter.next();
// if(graph.getEdgeWeight(e) < lowThreshold || graph.getEdgeWeight(e) > highThreshold) {
// continue;
// }
Integer source = graph.getEdgeSource(e);
Integer target = graph.getEdgeTarget(e);
firstMatchCDR3toCDR1Map.put(plateVtoCDR3Map.get(source), plateVtoCDR1Map.get(target));
}
System.out.println("First pass matches: " + firstMatchCDR3toCDR1Map.size());
System.out.println("Removing edges from first maximum weighted matching");
//zero out the edge weights in the matching
weightIter = graphMatching.iterator();
while(weightIter.hasNext()){
graph.removeEdge(weightIter.next());
}
System.out.println("Edges removed");
//Generate a new matching
System.out.println("Finding second maximum weighted matching");
MaximumWeightBipartiteMatching secondMaxWeightMatching =
new MaximumWeightBipartiteMatching(graph, plateVtoCDR3Map.keySet(), plateVtoCDR1Map.keySet());
graphMatching = secondMaxWeightMatching.getMatching();
System.out.println("Second maximum weighted matching found");
//second processing run
Map<Integer, Integer> secondMatchCDR3toCDR1Map = new HashMap<>();
weightIter = graphMatching.iterator();
while(weightIter.hasNext()){
e = weightIter.next();
// if(graph.getEdgeWeight(e) < lowThreshold || graph.getEdgeWeight(e) > highThreshold) {
// continue;
// }
Integer source = graph.getEdgeSource(e);
// if(!(CDR3AtoBMap.containsKey(source) || CDR3BtoAMap.containsKey(source))){
// continue;
// }
Integer target = graph.getEdgeTarget(e);
secondMatchCDR3toCDR1Map.put(plateVtoCDR3Map.get(source), plateVtoCDR1Map.get(target));
}
System.out.println("Second pass matches: " + secondMatchCDR3toCDR1Map.size());
System.out.println("Mapping first pass CDR3 alpha/beta pairs");
//get linked map for first matching attempt
Map<Integer, Integer> firstMatchesMap = new LinkedHashMap<>();
for(Integer alphaCDR3: cdr3AtoBMap.keySet()) {
if (!(firstMatchCDR3toCDR1Map.containsKey(alphaCDR3))) {
continue;
}
Integer betaCDR3 = cdr3AtoBMap.get(alphaCDR3);
if (!(firstMatchCDR3toCDR1Map.containsKey(betaCDR3))) {
continue;
}
firstMatchesMap.put(alphaCDR3, firstMatchCDR3toCDR1Map.get(alphaCDR3));
firstMatchesMap.put(betaCDR3, firstMatchCDR3toCDR1Map.get(betaCDR3));
}
System.out.println("First pass CDR3 alpha/beta pairs mapped");
System.out.println("Mapping second pass CDR3 alpha/beta pairs.");
System.out.println("Finding CDR3 pairs that swapped CDR1 matches between first pass and second pass.");
//Look for matches that simply swapped already-matched alpha and beta CDR3s
Map<Integer, Integer> dualMatchesMap = new LinkedHashMap<>();
for(Integer alphaCDR3: cdr3AtoBMap.keySet()) {
if (!(firstMatchCDR3toCDR1Map.containsKey(alphaCDR3) && secondMatchCDR3toCDR1Map.containsKey(alphaCDR3))) {
continue;
}
Integer betaCDR3 = cdr3AtoBMap.get(alphaCDR3);
if (!(firstMatchCDR3toCDR1Map.containsKey(betaCDR3) && secondMatchCDR3toCDR1Map.containsKey(betaCDR3))) {
continue;
}
if(firstMatchCDR3toCDR1Map.get(alphaCDR3).equals(secondMatchCDR3toCDR1Map.get(betaCDR3))){
if(firstMatchCDR3toCDR1Map.get(betaCDR3).equals(secondMatchCDR3toCDR1Map.get(alphaCDR3))){
dualMatchesMap.put(alphaCDR3, firstMatchCDR3toCDR1Map.get(alphaCDR3));
dualMatchesMap.put(betaCDR3, firstMatchCDR3toCDR1Map.get(betaCDR3));
}
}
}
System.out.println("Second pass mapping made. Dual CDR3/CDR1 pairings found.");
Instant stop = Instant.now();
//results for first map
System.out.println("RESULTS FOR FIRST PASS MATCHING");
List<List<String>> allResults = new ArrayList<>();
Integer trueCount = 0;
Iterator iter = firstMatchesMap.keySet().iterator();
while(iter.hasNext()){
Boolean proven = false;
List<String> tmp = new ArrayList<>();
tmp.add(iter.next().toString());
tmp.add(iter.next().toString());
tmp.add(firstMatchesMap.get(Integer.valueOf(tmp.get(0))).toString());
tmp.add(firstMatchesMap.get(Integer.valueOf(tmp.get(1))).toString());
if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(2)))){
if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(3)))){
proven = true;
}
}
else if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(3)))){
if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(2)))){
proven = true;
}
}
tmp.add(proven.toString());
allResults.add(tmp);
if(proven){
trueCount++;
}
}
List<String> comments = new ArrayList<>();
comments.add("Plate size: " + samplePlate.getSize() + " wells");
comments.add("Previous pairs found: " + previousMatches.size());
comments.add("CDR1 matches attempted: " + allResults.size());
double attemptRate = (double) allResults.size() / previousMatches.size();
comments.add("Matching attempt rate: " + attemptRate);
comments.add("Number of correct matches: " + trueCount);
double correctRate = (double) trueCount / allResults.size();
comments.add("Correct matching rate: " + correctRate);
NumberFormat nf = NumberFormat.getInstance(Locale.US);
Duration time = Duration.between(start, stop);
time = time.plus(previousTime);
comments.add("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
for(String s: comments){
System.out.println(s);
}
List<String> headers = new ArrayList<>();
headers.add("CDR3 alpha");
headers.add("CDR3 beta");
headers.add("first matched CDR1");
headers.add("second matched CDR1");
headers.add("Correct match?");
MatchingResult firstTest = new MatchingResult(samplePlate.getSourceFileName(),
comments, headers, allResults, dualMatchesMap, time);
//results for dual map
System.out.println("RESULTS FOR SECOND PASS MATCHING");
allResults = new ArrayList<>();
trueCount = 0;
iter = dualMatchesMap.keySet().iterator();
while(iter.hasNext()){
Boolean proven = false;
List<String> tmp = new ArrayList<>();
tmp.add(iter.next().toString());
tmp.add(iter.next().toString());
tmp.add(dualMatchesMap.get(Integer.valueOf(tmp.get(0))).toString());
tmp.add(dualMatchesMap.get(Integer.valueOf(tmp.get(1))).toString());
if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(2)))){
if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(3)))){
proven = true;
}
}
else if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(3)))){
if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(2)))){
proven = true;
}
}
tmp.add(proven.toString());
allResults.add(tmp);
if(proven){
trueCount++;
}
}
comments = new ArrayList<>();
comments.add("Plate size: " + samplePlate.getSize() + " wells");
comments.add("Previous pairs found: " + previousMatches.size());
comments.add("High overlap threshold: " + highThreshold);
comments.add("Low overlap threshold: " + lowThreshold);
comments.add("CDR1 matches attempted: " + allResults.size());
attemptRate = (double) allResults.size() / previousMatches.size();
comments.add("Matching attempt rate: " + attemptRate);
comments.add("Number of correct matches: " + trueCount);
correctRate = (double) trueCount / allResults.size();
comments.add("Correct matching rate: " + correctRate);
comments.add("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
for(String s: comments){
System.out.println(s);
}
System.out.println("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
MatchingResult dualTest = new MatchingResult(samplePlate.getSourceFileName(), comments, headers,
allResults, dualMatchesMap, time);
MatchingResult[] output = {firstTest, dualTest};
return output; return output;
} }
//Commented out CDR1 matching until it's time to re-implement it
// //Simulated matching of CDR1s to CDR3s. Requires MatchingResult from prior run of matchCDR3s.
// public static MatchingResult[] matchCDR1s(List<Integer[]> distinctCells,
// Plate samplePlate, Integer lowThreshold,
// Integer highThreshold, MatchingResult priorResult){
// Instant start = Instant.now();
// Duration previousTime = priorResult.getTime();
// Map<Integer, Integer> previousMatches = priorResult.getMatchMap();
// int numWells = samplePlate.getSize();
// int[] cdr3Indices = {cdr3AlphaIndex, cdr3BetaIndex};
// int[] cdr1Indices = {cdr1AlphaIndex, cdr1BetaIndex};
//
// System.out.println("Making previous match maps");
// Map<Integer, Integer> cdr3AtoBMap = previousMatches;
// Map<Integer, Integer> cdr3BtoAMap = invertVertexMap(cdr3AtoBMap);
// System.out.println("Previous match maps made");
//
// System.out.println("Making cell maps");
// Map<Integer, Integer> alphaCDR3toCDR1Map = makeSequenceToSequenceMap(distinctCells, cdr3AlphaIndex, cdr1AlphaIndex);
// Map<Integer, Integer> betaCDR3toCDR1Map = makeSequenceToSequenceMap(distinctCells, cdr3BetaIndex, cdr1BetaIndex);
// System.out.println("Cell maps made");
//
// System.out.println("Making well maps");
// Map<Integer, Integer> allCDR3s = samplePlate.assayWellsSequenceS(cdr3Indices);
// Map<Integer, Integer> allCDR1s = samplePlate.assayWellsSequenceS(cdr1Indices);
// int CDR3Count = allCDR3s.size();
// System.out.println("all CDR3s count: " + CDR3Count);
// int CDR1Count = allCDR1s.size();
// System.out.println("all CDR1s count: " + CDR1Count);
// System.out.println("Well maps made");
//
// System.out.println("Removing unpaired CDR3s from well maps");
// List<Integer> unpairedCDR3s = new ArrayList<>();
// for(Integer i: allCDR3s.keySet()){
// if(!(cdr3AtoBMap.containsKey(i) || cdr3BtoAMap.containsKey(i))){
// unpairedCDR3s.add(i);
// }
// }
// for(Integer i: unpairedCDR3s){
// allCDR3s.remove(i);
// }
// System.out.println("Unpaired CDR3s removed.");
// System.out.println("Remaining CDR3 count: " + allCDR3s.size());
//
// System.out.println("Removing below-minimum-overlap-threshold and saturating-occupancy CDR1s");
// filterByOccupancyThreshold(allCDR1s, lowThreshold, numWells - 1);
// System.out.println("CDR1s removed.");
// System.out.println("Remaining CDR1 count: " + allCDR1s.size());
//
// System.out.println("Making vertex maps");
//
// //For the SimpleWeightedBipartiteGraphMatrixGenerator, all vertices must have
// // distinct numbers associated with them. Since I'm using a 2D array, that means
// // distinct indices between the rows and columns. vertexStartValue lets me track where I switch
// // from numbering rows to columns, so I can assign unique numbers to every vertex, and then
// // subtract the vertexStartValue from CDR1s to use their vertex labels as array indices
// Integer vertexStartValue = 0;
// //keys are sequential integer vertices, values are CDR3s
// Map<Integer, Integer> plateVtoCDR3Map = makeVertexToSequenceMap(allCDR3s, vertexStartValue);
// //New start value for vertex to CDR1 map should be one more than final vertex value in CDR3 map
// vertexStartValue += plateVtoCDR3Map.size();
// //keys are sequential integers vertices, values are CDR1s
// Map<Integer, Integer> plateVtoCDR1Map = makeVertexToSequenceMap(allCDR1s, vertexStartValue);
// //keys are CDR3s, values are sequential integer vertices from previous map
// Map<Integer, Integer> plateCDR3toVMap = invertVertexMap(plateVtoCDR3Map);
// //keys are CDR1s, values are sequential integer vertices from previous map
// Map<Integer, Integer> plateCDR1toVMap = invertVertexMap(plateVtoCDR1Map);
// System.out.println("Vertex maps made");
//
// System.out.println("Creating adjacency matrix");
// //Count how many wells each CDR3 appears in
// Map<Integer, Integer> cdr3WellCounts = new HashMap<>();
// //count how many wells each CDR1 appears in
// Map<Integer, Integer> cdr1WellCounts = new HashMap<>();
// //add edges, where weights are number of wells the peptides share in common.
// //If this is too slow, can make a 2d array and use the SimpleWeightedGraphMatrixGenerator class
// Map<Integer, Integer> wellNCDR3s = null;
// Map<Integer, Integer> wellNCDR1s = null;
// double[][] weights = new double[plateVtoCDR3Map.size()][plateVtoCDR1Map.size()];
// countSequencesAndFillMatrix(samplePlate, allCDR3s, allCDR1s, plateCDR3toVMap, plateCDR1toVMap,
// cdr3Indices, cdr1Indices, cdr3WellCounts, cdr1WellCounts, weights);
// System.out.println("Matrix created");
//
// System.out.println("Creating graph");
// SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph =
// new SimpleWeightedGraph<>(DefaultWeightedEdge.class);
//
// SimpleWeightedBipartiteGraphMatrixGenerator graphGenerator = new SimpleWeightedBipartiteGraphMatrixGenerator();
// List<Integer> cdr3Vertices = new ArrayList<>(plateVtoCDR3Map.keySet()); //This will work because LinkedHashMap preserves order of entry
// graphGenerator.first(cdr3Vertices);
// List<Integer> cdr1Vertices = new ArrayList<>(plateVtoCDR1Map.keySet());
// graphGenerator.second(cdr1Vertices); //This will work because LinkedHashMap preserves order of entry
// graphGenerator.weights(weights);
// graphGenerator.generateGraph(graph);
// System.out.println("Graph created");
//
// System.out.println("Removing edges outside of weight thresholds");
// filterByOccupancyThreshold(graph, lowThreshold, highThreshold);
// System.out.println("Over- and under-weight edges set to 0.0");
//
// System.out.println("Finding first maximum weighted matching");
// MaximumWeightBipartiteMatching firstMaxWeightMatching =
// new MaximumWeightBipartiteMatching(graph, plateVtoCDR3Map.keySet(), plateVtoCDR1Map.keySet());
// MatchingAlgorithm.Matching<String, DefaultWeightedEdge> graphMatching = firstMaxWeightMatching.getMatching();
// System.out.println("First maximum weighted matching found");
//
//
// //first processing run
// Map<Integer, Integer> firstMatchCDR3toCDR1Map = new HashMap<>();
// Iterator<DefaultWeightedEdge> weightIter = graphMatching.iterator();
// DefaultWeightedEdge e;
// while(weightIter.hasNext()){
// e = weightIter.next();
//// if(graph.getEdgeWeight(e) < lowThreshold || graph.getEdgeWeight(e) > highThreshold) {
//// continue;
//// }
// Integer source = graph.getEdgeSource(e);
// Integer target = graph.getEdgeTarget(e);
// firstMatchCDR3toCDR1Map.put(plateVtoCDR3Map.get(source), plateVtoCDR1Map.get(target));
// }
// System.out.println("First pass matches: " + firstMatchCDR3toCDR1Map.size());
//
// System.out.println("Removing edges from first maximum weighted matching");
// //zero out the edge weights in the matching
// weightIter = graphMatching.iterator();
// while(weightIter.hasNext()){
// graph.removeEdge(weightIter.next());
// }
// System.out.println("Edges removed");
//
// //Generate a new matching
// System.out.println("Finding second maximum weighted matching");
// MaximumWeightBipartiteMatching secondMaxWeightMatching =
// new MaximumWeightBipartiteMatching(graph, plateVtoCDR3Map.keySet(), plateVtoCDR1Map.keySet());
// graphMatching = secondMaxWeightMatching.getMatching();
// System.out.println("Second maximum weighted matching found");
//
//
// //second processing run
// Map<Integer, Integer> secondMatchCDR3toCDR1Map = new HashMap<>();
// weightIter = graphMatching.iterator();
// while(weightIter.hasNext()){
// e = weightIter.next();
//// if(graph.getEdgeWeight(e) < lowThreshold || graph.getEdgeWeight(e) > highThreshold) {
//// continue;
//// }
// Integer source = graph.getEdgeSource(e);
//// if(!(CDR3AtoBMap.containsKey(source) || CDR3BtoAMap.containsKey(source))){
//// continue;
//// }
// Integer target = graph.getEdgeTarget(e);
// secondMatchCDR3toCDR1Map.put(plateVtoCDR3Map.get(source), plateVtoCDR1Map.get(target));
// }
// System.out.println("Second pass matches: " + secondMatchCDR3toCDR1Map.size());
//
// System.out.println("Mapping first pass CDR3 alpha/beta pairs");
// //get linked map for first matching attempt
// Map<Integer, Integer> firstMatchesMap = new LinkedHashMap<>();
// for(Integer alphaCDR3: cdr3AtoBMap.keySet()) {
// if (!(firstMatchCDR3toCDR1Map.containsKey(alphaCDR3))) {
// continue;
// }
// Integer betaCDR3 = cdr3AtoBMap.get(alphaCDR3);
// if (!(firstMatchCDR3toCDR1Map.containsKey(betaCDR3))) {
// continue;
// }
// firstMatchesMap.put(alphaCDR3, firstMatchCDR3toCDR1Map.get(alphaCDR3));
// firstMatchesMap.put(betaCDR3, firstMatchCDR3toCDR1Map.get(betaCDR3));
// }
// System.out.println("First pass CDR3 alpha/beta pairs mapped");
//
// System.out.println("Mapping second pass CDR3 alpha/beta pairs.");
// System.out.println("Finding CDR3 pairs that swapped CDR1 matches between first pass and second pass.");
// //Look for matches that simply swapped already-matched alpha and beta CDR3s
// Map<Integer, Integer> dualMatchesMap = new LinkedHashMap<>();
// for(Integer alphaCDR3: cdr3AtoBMap.keySet()) {
// if (!(firstMatchCDR3toCDR1Map.containsKey(alphaCDR3) && secondMatchCDR3toCDR1Map.containsKey(alphaCDR3))) {
// continue;
// }
// Integer betaCDR3 = cdr3AtoBMap.get(alphaCDR3);
// if (!(firstMatchCDR3toCDR1Map.containsKey(betaCDR3) && secondMatchCDR3toCDR1Map.containsKey(betaCDR3))) {
// continue;
// }
// if(firstMatchCDR3toCDR1Map.get(alphaCDR3).equals(secondMatchCDR3toCDR1Map.get(betaCDR3))){
// if(firstMatchCDR3toCDR1Map.get(betaCDR3).equals(secondMatchCDR3toCDR1Map.get(alphaCDR3))){
// dualMatchesMap.put(alphaCDR3, firstMatchCDR3toCDR1Map.get(alphaCDR3));
// dualMatchesMap.put(betaCDR3, firstMatchCDR3toCDR1Map.get(betaCDR3));
// }
// }
// }
// System.out.println("Second pass mapping made. Dual CDR3/CDR1 pairings found.");
//
// Instant stop = Instant.now();
// //results for first map
// System.out.println("RESULTS FOR FIRST PASS MATCHING");
// List<List<String>> allResults = new ArrayList<>();
// Integer trueCount = 0;
// Iterator iter = firstMatchesMap.keySet().iterator();
//
// while(iter.hasNext()){
// Boolean proven = false;
// List<String> tmp = new ArrayList<>();
// tmp.add(iter.next().toString());
// tmp.add(iter.next().toString());
// tmp.add(firstMatchesMap.get(Integer.valueOf(tmp.get(0))).toString());
// tmp.add(firstMatchesMap.get(Integer.valueOf(tmp.get(1))).toString());
// if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(2)))){
// if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(3)))){
// proven = true;
// }
// }
// else if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(3)))){
// if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(2)))){
// proven = true;
// }
// }
// tmp.add(proven.toString());
// allResults.add(tmp);
// if(proven){
// trueCount++;
// }
// }
//
// List<String> comments = new ArrayList<>();
// comments.add("Plate size: " + samplePlate.getSize() + " wells");
// comments.add("Previous pairs found: " + previousMatches.size());
// comments.add("CDR1 matches attempted: " + allResults.size());
// double attemptRate = (double) allResults.size() / previousMatches.size();
// comments.add("Matching attempt rate: " + attemptRate);
// comments.add("Number of correct matches: " + trueCount);
// double correctRate = (double) trueCount / allResults.size();
// comments.add("Correct matching rate: " + correctRate);
// NumberFormat nf = NumberFormat.getInstance(Locale.US);
// Duration time = Duration.between(start, stop);
// time = time.plus(previousTime);
// comments.add("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
// for(String s: comments){
// System.out.println(s);
// }
//
//
//
// List<String> headers = new ArrayList<>();
// headers.add("CDR3 alpha");
// headers.add("CDR3 beta");
// headers.add("first matched CDR1");
// headers.add("second matched CDR1");
// headers.add("Correct match?");
//
// MatchingResult firstTest = new MatchingResult(samplePlate.getSourceFileName(),
// comments, headers, allResults, dualMatchesMap, time);
//
// //results for dual map
// System.out.println("RESULTS FOR SECOND PASS MATCHING");
// allResults = new ArrayList<>();
// trueCount = 0;
// iter = dualMatchesMap.keySet().iterator();
// while(iter.hasNext()){
// Boolean proven = false;
// List<String> tmp = new ArrayList<>();
// tmp.add(iter.next().toString());
// tmp.add(iter.next().toString());
// tmp.add(dualMatchesMap.get(Integer.valueOf(tmp.get(0))).toString());
// tmp.add(dualMatchesMap.get(Integer.valueOf(tmp.get(1))).toString());
// if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(2)))){
// if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(3)))){
// proven = true;
// }
// }
// else if(alphaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(0))).equals(Integer.valueOf(tmp.get(3)))){
// if(betaCDR3toCDR1Map.get(Integer.valueOf(tmp.get(1))).equals(Integer.valueOf(tmp.get(2)))){
// proven = true;
// }
// }
// tmp.add(proven.toString());
// allResults.add(tmp);
// if(proven){
// trueCount++;
// }
// }
//
// comments = new ArrayList<>();
// comments.add("Plate size: " + samplePlate.getSize() + " wells");
// comments.add("Previous pairs found: " + previousMatches.size());
// comments.add("High overlap threshold: " + highThreshold);
// comments.add("Low overlap threshold: " + lowThreshold);
// comments.add("CDR1 matches attempted: " + allResults.size());
// attemptRate = (double) allResults.size() / previousMatches.size();
// comments.add("Matching attempt rate: " + attemptRate);
// comments.add("Number of correct matches: " + trueCount);
// correctRate = (double) trueCount / allResults.size();
// comments.add("Correct matching rate: " + correctRate);
// comments.add("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
//
// for(String s: comments){
// System.out.println(s);
// }
//
// System.out.println("Simulation time: " + nf.format(time.toSeconds()) + " seconds");
// MatchingResult dualTest = new MatchingResult(samplePlate.getSourceFileName(), comments, headers,
// allResults, dualMatchesMap, time);
// MatchingResult[] output = {firstTest, dualTest};
// return output;
// }
//Remove sequences based on occupancy
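//e.g., with low = 3 and high = 90 on a 96-well plate, a sequence appearing in only 1 well
//or in 95 wells falls outside [3, 90] and is removed from the map as uninformative noise.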
public static void filterByOccupancyThresholds(Map<Integer, Integer> wellMap, int low, int high){
List<Integer> noise = new ArrayList<>();
for(Integer k: wellMap.keySet()){
if((wellMap.get(k) > high) || (wellMap.get(k) < low)){
noise.add(k);
}
}
for(Integer k: noise) {
wellMap.remove(k);
}
}
//Counts the well occupancy of the row peptides and column peptides into given maps, and
//fills weights in the given 2D array
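//A minimal sketch of the counting step described above, using hypothetical names and a
//hypothetical parameter layout (not the project's actual implementation): each well increments
//the occupancy count of every sequence it contains, and adds one unit of weight for every
//row/column sequence pair that co-occurs in it.
//
//private static void countOccupancies(List<Set<Integer>> rowSeqsPerWell,
//                                     List<Set<Integer>> colSeqsPerWell,
//                                     Map<Integer, Integer> rowCounts,
//                                     Map<Integer, Integer> colCounts,
//                                     Map<Integer, Integer> rowIndex,
//                                     Map<Integer, Integer> colIndex,
//                                     double[][] weights) {
//    for (int w = 0; w < rowSeqsPerWell.size(); w++) {
//        for (Integer r : rowSeqsPerWell.get(w)) {
//            rowCounts.merge(r, 1, Integer::sum);      //well occupancy of row peptide
//        }
//        for (Integer c : colSeqsPerWell.get(w)) {
//            colCounts.merge(c, 1, Integer::sum);      //well occupancy of column peptide
//        }
//        for (Integer r : rowSeqsPerWell.get(w)) {     //co-occurrence weights
//            for (Integer c : colSeqsPerWell.get(w)) {
//                weights[rowIndex.get(r)][colIndex.get(c)] += 1.0;
//            }
//        }
//    }
//}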
@@ -621,62 +679,6 @@ public class Simulator {
}
}
private static void filterByOccupancyThreshold(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
int low, int high) {
for(DefaultWeightedEdge e: graph.edgeSet()){
if ((graph.getEdgeWeight(e) > high) || (graph.getEdgeWeight(e) < low)){
graph.setEdgeWeight(e, 0.0);
}
}
}
private static void filterByOccupancyThreshold(Map<Integer, Integer> wellMap, int low, int high){
List<Integer> noise = new ArrayList<>();
for(Integer k: wellMap.keySet()){
if((wellMap.get(k) > high) || (wellMap.get(k) < low)){
noise.add(k);
}
}
for(Integer k: noise) {
wellMap.remove(k);
}
}
//Remove edges for pairs with large occupancy discrepancy
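//e.g., with maxOccupancyDifference = 10, an alpha occupying 40 wells paired with a beta
//occupying 55 wells differs by 15, so the edge weight is set to 0 and the pair is skipped.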
private static void filterByRelativeOccupancy(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer maxOccupancyDifference) {
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
Integer betaOcc = betaWellCounts.get(plateVtoBMap.get(graph.getEdgeTarget(e)));
//Adjust this to something cleverer later
if (Math.abs(alphaOcc - betaOcc) >= maxOccupancyDifference) {
graph.setEdgeWeight(e, 0.0);
}
}
}
//Remove edges for pairs where overlap size is significantly lower than the well occupancy
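//e.g., with minOverlapPercent = 50, an edge of weight 20 between an alpha occupying 50 wells
//and a beta occupying 30 wells fails the alpha-side check (20/50 = 0.4 < 0.5) and is zeroed.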
private static void filterByOverlapSize(SimpleWeightedGraph<Integer, DefaultWeightedEdge> graph,
Map<Integer, Integer> alphaWellCounts,
Map<Integer, Integer> betaWellCounts,
Map<Integer, Integer> plateVtoAMap,
Map<Integer, Integer> plateVtoBMap,
Integer minOverlapPercent) {
for (DefaultWeightedEdge e : graph.edgeSet()) {
Integer alphaOcc = alphaWellCounts.get(plateVtoAMap.get(graph.getEdgeSource(e)));
Integer betaOcc = betaWellCounts.get(plateVtoBMap.get(graph.getEdgeTarget(e)));
double weight = graph.getEdgeWeight(e);
double min = minOverlapPercent / 100.0;
if ((weight / alphaOcc < min) || (weight / betaOcc < min)) {
graph.setEdgeWeight(e, 0.0);
}
}
}
private static Map<Integer, Integer> makeSequenceToSequenceMap(List<Integer[]> cells, int keySequenceIndex,
                                                               int valueSequenceIndex){
    Map<Integer, Integer> keySequenceToValueSequenceMap = new HashMap<>();
View File
@@ -1,694 +0,0 @@
import org.apache.commons.cli.*;
import java.io.IOException;
import java.util.List;
import java.util.Scanner;
import java.util.InputMismatchException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
//
public class UserInterface {
final static Scanner sc = new Scanner(System.in);
static int input;
static boolean quit = false;
public static void main(String[] args) {
//for now, commenting out all the command line argument stuff.
// Refactoring to output files of graphs, so it would all need to change anyway.
// if(args.length != 0){
// //These command line options are a big mess
// //Really, I don't think command line tools are expected to work in this many different modes
// //making cells, making plates, and matching are the sort of thing that UNIX philosophy would say
// //should be three separate programs.
// //There might be a way to do it with option parameters?
//
// Options mainOptions = new Options();
// Option makeCells = Option.builder("cells")
// .longOpt("make-cells")
// .desc("Makes a file of distinct cells")
// .build();
// Option makePlate = Option.builder("plates")
// .longOpt("make-plates")
// .desc("Makes a sample plate file")
// .build();
// Option matchCDR3 = Option.builder("match")
// .longOpt("match-cdr3")
// .desc("Match CDR3s. Requires a cell sample file and any number of plate files.")
// .build();
// OptionGroup mainGroup = new OptionGroup();
// mainGroup.addOption(makeCells);
// mainGroup.addOption(makePlate);
// mainGroup.addOption(matchCDR3);
// mainGroup.setRequired(true);
// mainOptions.addOptionGroup(mainGroup);
//
// //Reuse clones of this for other options groups, rather than making it lots of times
// Option outputFile = Option.builder("o")
// .longOpt("output-file")
// .hasArg()
// .argName("filename")
// .desc("Name of output file")
// .build();
// mainOptions.addOption(outputFile);
//
// //Options cellOptions = new Options();
// Option numCells = Option.builder("nc")
// .longOpt("num-cells")
// .desc("The number of distinct cells to generate")
// .hasArg()
// .argName("number")
// .build();
// mainOptions.addOption(numCells);
// Option cdr1Freq = Option.builder("d")
// .longOpt("peptide-diversity-factor")
// .hasArg()
// .argName("number")
// .desc("Number of distinct CDR3s for every CDR1")
// .build();
// mainOptions.addOption(cdr1Freq);
// //Option cellOutput = (Option) outputFile.clone();
// //cellOutput.setRequired(true);
// //mainOptions.addOption(cellOutput);
//
// //Options plateOptions = new Options();
// Option inputCells = Option.builder("c")
// .longOpt("cell-file")
// .hasArg()
// .argName("file")
// .desc("The cell sample file used for filling wells")
// .build();
// mainOptions.addOption(inputCells);
// Option numWells = Option.builder("w")
// .longOpt("num-wells")
// .hasArg()
// .argName("number")
// .desc("The number of wells on each plate")
// .build();
// mainOptions.addOption(numWells);
// Option numPlates = Option.builder("np")
// .longOpt("num-plates")
// .hasArg()
// .argName("number")
// .desc("The number of plate files to output")
// .build();
// mainOptions.addOption(numPlates);
// //Option plateOutput = (Option) outputFile.clone();
// //plateOutput.setRequired(true);
// //plateOutput.setDescription("Prefix for plate output filenames");
// //mainOptions.addOption(plateOutput);
// Option plateErr = Option.builder("err")
// .longOpt("drop-out-rate")
// .hasArg()
// .argName("number")
// .desc("Well drop-out rate. (Probability between 0 and 1)")
// .build();
// mainOptions.addOption(plateErr);
// Option plateConcentrations = Option.builder("t")
// .longOpt("t-cells-per-well")
// .hasArgs()
// .argName("number 1, number 2, ...")
// .desc("Number of T cells per well for each plate section")
// .build();
// mainOptions.addOption(plateConcentrations);
//
////different distributions, mutually exclusive
// OptionGroup plateDistributions = new OptionGroup();
// Option plateExp = Option.builder("exponential")
// .desc("Sample from distinct cells with exponential frequency distribution")
// .build();
// plateDistributions.addOption(plateExp);
// Option plateGaussian = Option.builder("gaussian")
// .desc("Sample from distinct cells with gaussian frequency distribution")
// .build();
// plateDistributions.addOption(plateGaussian);
// Option platePoisson = Option.builder("poisson")
// .desc("Sample from distinct cells with poisson frequency distribution")
// .build();
// plateDistributions.addOption(platePoisson);
// mainOptions.addOptionGroup(plateDistributions);
//
// Option plateStdDev = Option.builder("stddev")
// .desc("Standard deviation for gaussian distribution")
// .hasArg()
// .argName("number")
// .build();
// mainOptions.addOption(plateStdDev);
//
// Option plateLambda = Option.builder("lambda")
// .desc("Lambda for exponential distribution")
// .hasArg()
// .argName("number")
// .build();
// mainOptions.addOption(plateLambda);
//
//
//
////
//// String cellFile, String filename, Double stdDev,
//// Integer numWells, Integer numSections,
//// Integer[] concentrations, Double dropOutRate
////
//
// //Options matchOptions = new Options();
// inputCells.setDescription("The cell sample file to be used for matching.");
// mainOptions.addOption(inputCells);
// Option lowThresh = Option.builder("low")
// .longOpt("low-threshold")
// .hasArg()
// .argName("number")
// .desc("Sets the minimum occupancy overlap to attempt matching")
// .build();
// mainOptions.addOption(lowThresh);
// Option highThresh = Option.builder("high")
// .longOpt("high-threshold")
// .hasArg()
// .argName("number")
// .desc("Sets the maximum occupancy overlap to attempt matching")
// .build();
// mainOptions.addOption(highThresh);
// Option occDiff = Option.builder("occdiff")
// .longOpt("occupancy-difference")
// .hasArg()
// .argName("Number")
// .desc("Maximum difference in alpha/beta occupancy to attempt matching")
// .build();
// mainOptions.addOption(occDiff);
// Option overlapPer = Option.builder("ovper")
// .longOpt("overlap-percent")
// .hasArg()
// .argName("Percent")
// .desc("Minimum overlap percent to attempt matching (0 -100)")
// .build();
// mainOptions.addOption(overlapPer);
// Option inputPlates = Option.builder("p")
// .longOpt("plate-files")
// .hasArgs()
// .desc("Plate files to match")
// .build();
// mainOptions.addOption(inputPlates);
//
//
//
// CommandLineParser parser = new DefaultParser();
// try {
// CommandLine line = parser.parse(mainOptions, args);
// if(line.hasOption("match")){
// //line = parser.parse(mainOptions, args);
// String cellFile = line.getOptionValue("c");
// Integer lowThreshold = Integer.valueOf(line.getOptionValue(lowThresh));
// Integer highThreshold = Integer.valueOf(line.getOptionValue(highThresh));
// Integer occupancyDifference = Integer.valueOf(line.getOptionValue(occDiff));
// Integer overlapPercent = Integer.valueOf(line.getOptionValue(overlapPer));
// for(String plate: line.getOptionValues("p")) {
// matchCDR3s(cellFile, plate, lowThreshold, highThreshold, occupancyDifference, overlapPercent);
// }
// }
// else if(line.hasOption("cells")){
// //line = parser.parse(mainOptions, args);
// String filename = line.getOptionValue("o");
// Integer numDistCells = Integer.valueOf(line.getOptionValue("nc"));
// Integer freq = Integer.valueOf(line.getOptionValue("d"));
// makeCells(filename, numDistCells, freq);
// }
// else if(line.hasOption("plates")){
// //line = parser.parse(mainOptions, args);
// String cellFile = line.getOptionValue("c");
// String filenamePrefix = line.getOptionValue("o");
// Integer numWellsOnPlate = Integer.valueOf(line.getOptionValue("w"));
// Integer numPlatesToMake = Integer.valueOf(line.getOptionValue("np"));
// String[] concentrationsToUseString = line.getOptionValues("t");
// Integer numSections = concentrationsToUseString.length;
//
// Integer[] concentrationsToUse = new Integer[numSections];
// for(int i = 0; i <numSections; i++){
// concentrationsToUse[i] = Integer.valueOf(concentrationsToUseString[i]);
// }
// Double dropOutRate = Double.valueOf(line.getOptionValue("err"));
// if(line.hasOption("exponential")){
// Double lambda = Double.valueOf(line.getOptionValue("lambda"));
// for(int i = 1; i <= numPlatesToMake; i++){
// makePlateExp(cellFile, filenamePrefix + i, lambda, numWellsOnPlate,
// concentrationsToUse,dropOutRate);
// }
// }
// else if(line.hasOption("gaussian")){
// Double stdDev = Double.valueOf(line.getOptionValue("stddev"));
// for(int i = 1; i <= numPlatesToMake; i++){
// makePlate(cellFile, filenamePrefix + i, stdDev, numWellsOnPlate,
// concentrationsToUse,dropOutRate);
// }
//
// }
// else if(line.hasOption("poisson")){
// for(int i = 1; i <= numPlatesToMake; i++){
// makePlatePoisson(cellFile, filenamePrefix + i, numWellsOnPlate,
// concentrationsToUse,dropOutRate);
// }
// }
// }
// }
// catch (ParseException exp) {
// System.err.println("Parsing failed. Reason: " + exp.getMessage());
// }
// }
// else {
while (!quit) {
System.out.println();
System.out.println("--------BiGPairSEQ SIMULATOR--------");
System.out.println("ALPHA/BETA T CELL RECEPTOR MATCHING");
System.out.println(" USING WEIGHTED BIPARTITE GRAPHS ");
System.out.println("------------------------------------");
System.out.println("Please select an option:");
System.out.println("1) Generate a population of distinct cells");
System.out.println("2) Generate a sample plate of T cells");
System.out.println("3) Generate CDR3 alpha/beta occupancy data and overlap graph");
System.out.println("4) Simulate bipartite graph CDR3 alpha/beta matching (BiGpairSEQ)");
//Need to re-do the CDR3/CDR1 matching to correspond to new pattern
//System.out.println("5) Generate CDR3/CDR1 occupancy graph");
//System.out.println("6) Simulate CDR3/CDR1 T cell matching");
System.out.println("9) About/Acknowledgments");
System.out.println("0) Exit");
try {
input = sc.nextInt();
switch (input) {
case 1 -> makeCells();
case 2 -> makePlate();
case 3 -> makeCDR3Graph();
case 4 -> matchCDR3s();
//case 6 -> matchCellsCDR1();
case 9 -> acknowledge();
case 0 -> quit = true;
default -> throw new InputMismatchException("Invalid input.");
}
} catch (InputMismatchException | IOException ex) {
System.out.println(ex);
sc.next();
}
}
sc.close();
// }
}
private static void makeCells() {
String filename = null;
Integer numCells = 0;
Integer cdr1Freq = 1;
try {
System.out.println("\nSimulated T-Cells consist of integer values representing:\n" +
"* a pair of alpha and beta CDR3 peptides (unique within simulated population)\n" +
"* a pair of alpha and beta CDR1 peptides (not necessarily unique).");
System.out.println("\nThe cells will be written to a CSV file.");
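//Each cell is read back elsewhere as an Integer[]; a field order such as
//{alphaCDR3, betaCDR3, alphaCDR1, betaCDR1} is only a hypothetical illustration.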
System.out.print("Please enter a file name: ");
filename = sc.next();
System.out.println("\nCDR3 sequences are more diverse than CDR1 sequences.");
System.out.println("Please enter the factor by which distinct CDR3s outnumber CDR1s: ");
cdr1Freq = sc.nextInt();
System.out.print("\nPlease enter the number of T-cells to generate: ");
numCells = sc.nextInt();
if(numCells <= 0){
throw new InputMismatchException("Number of cells must be a positive integer.");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
CellSample sample = Simulator.generateCellSample(numCells, cdr1Freq);
assert filename != null;
CellFileWriter writer = new CellFileWriter(filename, sample);
writer.writeCellsToFile();
System.gc();
}
// //for calling from command line
// private static void makeCells(String filename, Integer numCells, Integer cdr1Freq){
// CellSample sample = Simulator.generateCellSample(numCells, cdr1Freq);
// CellFileWriter writer = new CellFileWriter(filename, sample);
// writer.writeCellsToFile();
// }
//
// private static void makePlateExp(String cellFile, String filename, Double lambda,
// Integer numWells, Integer[] concentrations, Double dropOutRate){
// CellFileReader cellReader = new CellFileReader(cellFile);
// Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
// samplePlate.fillWellsExponential(cellReader.getFilename(), cellReader.getCells(), lambda);
// PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
// writer.writePlateFile();
// }
//
// private static void makePlatePoisson(String cellFile, String filename, Integer numWells,
// Integer[] concentrations, Double dropOutRate){
// CellFileReader cellReader = new CellFileReader(cellFile);
// Double stdDev = Math.sqrt(cellReader.getCellCount());
// Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
// samplePlate.fillWells(cellReader.getFilename(), cellReader.getCells(), stdDev);
// PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
// writer.writePlateFile();
// }
//
// private static void makePlate(String cellFile, String filename, Double stdDev,
// Integer numWells, Integer[] concentrations, Double dropOutRate){
// CellFileReader cellReader = new CellFileReader(cellFile);
// Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
// samplePlate.fillWells(cellReader.getFilename(), cellReader.getCells(), stdDev);
// PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
// writer.writePlateFile();
// }
//Output a CSV of sample plate
private static void makePlate() {
String cellFile = null;
String filename = null;
Double stdDev = 0.0;
Integer numWells = 0;
Integer numSections;
Integer[] concentrations = {1};
Double dropOutRate = 0.0;
boolean poisson = false;
boolean exponential = false;
double lambda = 1.5;
try {
System.out.println("\nSimulated sample plates consist of:");
System.out.println("* a number of wells");
System.out.println(" * separated into one or more sections");
System.out.println(" * each of which has a set quantity of cells per well");
System.out.println(" * selected from a statistical distribution of distinct cells");
System.out.println(" * with a set dropout rate for individual sequences within a cell");
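//The entered values are used below to build the plate roughly as
//new Plate(numWells, dropOutRate, concentrations) and fill it from the cell sample file.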
System.out.println("\nMaking a sample plate requires a population of distinct cells");
System.out.print("Please enter name of an existing cell sample file: ");
cellFile = sc.next();
System.out.println("\nThe sample plate will be written to a CSV file");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
System.out.println("\nSelect T-cell frequency distribution function");
System.out.println("1) Poisson");
System.out.println("2) Gaussian");
System.out.println("3) Exponential");
System.out.println("(Note: approximate distribution in original paper is exponential, lambda = 0.6)");
System.out.println("(lambda value approximated from slope of log-log graph in figure 4c)");
System.out.println("(Note: wider distributions are more memory intensive to match)");
System.out.print("Enter selection value: ");
input = sc.nextInt();
switch (input) {
case 1 -> poisson = true;
case 2 -> {
System.out.println("How many distinct T-cells within one standard deviation of peak frequency?");
System.out.println("(Note: wider distributions are more memory intensive to match)");
stdDev = sc.nextDouble();
if (stdDev <= 0.0) {
throw new InputMismatchException("Value must be positive.");
}
}
case 3 -> {
exponential = true;
System.out.println("Please enter lambda value for exponential distribution.");
lambda = sc.nextDouble();
if (lambda <= 0.0) {
throw new InputMismatchException("Value must be positive.");
}
}
default -> {
System.out.println("Invalid input. Defaulting to exponential.");
exponential = true;
}
}
System.out.print("\nNumber of wells on plate: ");
numWells = sc.nextInt();
if(numWells < 1){
throw new InputMismatchException("No wells on plate");
}
System.out.println("\nThe plate can be evenly sectioned to allow multiple concentrations of T-cells/well");
System.out.println("How many sections would you like to make (minimum 1)?");
numSections = sc.nextInt();
if(numSections < 1) {
throw new InputMismatchException("Too few sections.");
}
else if (numSections > numWells) {
throw new InputMismatchException("Cannot have more sections than wells.");
}
int i = 1;
concentrations = new Integer[numSections];
while(numSections > 0) {
System.out.print("Enter number of T-cells per well in section " + i +": ");
concentrations[i - 1] = sc.nextInt();
i++;
numSections--;
}
System.out.println("\nErrors in amplification can induce a well dropout rate for sequences");
System.out.print("Enter well dropout rate (0.0 to 1.0): ");
dropOutRate = sc.nextDouble();
if(dropOutRate < 0.0 || dropOutRate > 1.0) {
throw new InputMismatchException("The well dropout rate must be in the range [0.0, 1.0]");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
System.out.println("Reading Cell Sample file: " + cellFile);
assert cellFile != null;
CellFileReader cellReader = new CellFileReader(cellFile);
if(exponential){
Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
samplePlate.fillWellsExponential(cellReader.getFilename(), cellReader.getCells(), lambda);
PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
writer.writePlateFile();
}
else {
if (poisson) {
stdDev = Math.sqrt(cellReader.getCellCount()); //a Gaussian with stdDev = sqrt(N) approximates a Poisson distribution with mean N
}
Plate samplePlate = new Plate(numWells, dropOutRate, concentrations);
samplePlate.fillWells(cellReader.getFilename(), cellReader.getCells(), stdDev);
assert filename != null;
PlateFileWriter writer = new PlateFileWriter(filename, samplePlate);
System.out.println("Writing Sample Plate to file");
writer.writePlateFile();
System.out.println("Sample Plate written to file: " + filename);
System.gc();
}
}
//Output serialized binary of GraphWithMapData object
private static void makeCDR3Graph() {
String filename = null;
String cellFile = null;
String plateFile = null;
try {
String str = "\nGenerating bipartite weighted graph encoding occupancy overlap data ";
str = str.concat("\nrequires a cell sample file and a sample plate file.");
System.out.println(str);
System.out.print("\nPlease enter name of an existing cell sample file: ");
cellFile = sc.next();
System.out.print("\nPlease enter name of an existing sample plate file: ");
plateFile = sc.next();
System.out.println("\nThe graph and occupancy data will be written to a serialized binary file.");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
System.out.println("Reading Cell Sample file: " + cellFile);
assert cellFile != null;
CellFileReader cellReader = new CellFileReader(cellFile);
System.out.println("Reading Sample Plate file: " + plateFile);
assert plateFile != null;
PlateFileReader plateReader = new PlateFileReader(plateFile);
Plate plate = new Plate(plateReader.getFilename(), plateReader.getWells());
if (cellReader.getCells().size() == 0){
System.out.println("No cell sample found.");
System.out.println("Returning to main menu.");
}
else if(plate.getWells().size() == 0 || plate.getConcentrations().length == 0){
System.out.println("No sample plate found.");
System.out.println("Returning to main menu.");
}
else{
List<Integer[]> cells = cellReader.getCells();
GraphWithMapData data = Simulator.makeGraph(cells, plate, true);
assert filename != null;
GraphDataObjectWriter dataWriter = new GraphDataObjectWriter(filename, data);
System.out.println("Writing graph and occupancy data to file. This may take some time.");
System.out.println("File I/O time is not included in results.");
dataWriter.writeDataToFile();
System.out.println("Graph and Data file written to: " + filename);
System.gc();
}
}
//Simulate matching and output CSV file of results
private static void matchCDR3s() throws IOException {
String filename = null;
String dataFilename = null;
Integer lowThreshold = 0;
Integer highThreshold = Integer.MAX_VALUE;
Integer maxOccupancyDiff = Integer.MAX_VALUE;
Integer minOverlapPercent = 0;
try {
System.out.println("\nBiGpairSEQ simulation requires a graph and occupancy data file");
System.out.println("Please enter name of an existing graph and occupancy data file: ");
dataFilename = sc.next();
System.out.println("The matching results will be written to a file.");
System.out.print("Please enter a name for the output file: ");
filename = sc.next();
System.out.println("\nWhat is the minimum number of CDR3 alpha/beta overlap wells to attempt matching?");
lowThreshold = sc.nextInt();
if(lowThreshold < 1){
throw new InputMismatchException("Minimum value for low threshold is 1");
}
System.out.println("\nWhat is the maximum number of CDR3 alpha/beta overlap wells to attempt matching?");
highThreshold = sc.nextInt();
System.out.println("\nWhat is the maximum difference in alpha/beta occupancy to attempt matching?");
maxOccupancyDiff = sc.nextInt();
System.out.println("\nWell overlap percentage = pair overlap / sequence occupancy");
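//e.g., a pair overlapping in 18 wells where the alpha occupies 30 wells has an
//alpha-side overlap percentage of 18/30 = 60%.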
System.out.println("What is the minimum well overlap percentage to attempt matching? (0 to 100)");
minOverlapPercent = sc.nextInt();
if (minOverlapPercent < 0 || minOverlapPercent > 100) {
throw new InputMismatchException("Value outside range. Overlap percentage must be between 0 and 100.");
}
} catch (InputMismatchException ex) {
System.out.println(ex);
sc.next();
}
//read object data from file
System.out.println("Reading graph data from file. This may take some time");
System.out.println("File I/O time is not included in results");
assert dataFilename != null;
GraphDataObjectReader dataReader = new GraphDataObjectReader(dataFilename);
GraphWithMapData data = dataReader.getData();
//set source file name
data.setSourceFilename(dataFilename);
//simulate matching
MatchingResult results = Simulator.matchCDR3s(data, dataFilename, lowThreshold, highThreshold, maxOccupancyDiff,
minOverlapPercent, true);
//write results to file
assert filename != null;
MatchingFileWriter writer = new MatchingFileWriter(filename, results);
System.out.println("Writing results to file");
writer.writeResultsToFile();
System.out.println("Results written to file: " + filename);
System.gc();
}
///////
//Rewrite this to fit new matchCDR3 method with file I/O
///////
// public static void matchCellsCDR1(){
// /*
// The idea here is that we'll get the CDR3 alpha/beta matches first. Then we'll try to match CDR3s to CDR1s by
// looking at the top two matches for each CDR3. If CDR3s in the same cell simply swap CDR1s, we assume a correct
// match
// */
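//     e.g., if the first pass pairs (alphaCDR3 -> CDR1_x, betaCDR3 -> CDR1_y) and the second
//     pass pairs (alphaCDR3 -> CDR1_y, betaCDR3 -> CDR1_x), the swap is taken as evidence
//     that both CDR1s belong to the same cell as the CDR3 pair.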
// String filename = null;
// String preliminaryResultsFilename = null;
// String cellFile = null;
// String plateFile = null;
// Integer lowThresholdCDR3 = 0;
// Integer highThresholdCDR3 = Integer.MAX_VALUE;
// Integer maxOccupancyDiffCDR3 = 96; //no filtering if max difference is all wells by default
// Integer minOverlapPercentCDR3 = 0; //no filtering if min percentage is zero by default
// Integer lowThresholdCDR1 = 0;
// Integer highThresholdCDR1 = Integer.MAX_VALUE;
// boolean outputCDR3Matches = false;
// try {
// System.out.println("\nSimulated experiment requires a cell sample file and a sample plate file.");
// System.out.print("Please enter name of an existing cell sample file: ");
// cellFile = sc.next();
// System.out.print("Please enter name of an existing sample plate file: ");
// plateFile = sc.next();
// System.out.println("The matching results will be written to a file.");
// System.out.print("Please enter a name for the output file: ");
// filename = sc.next();
// System.out.println("What is the minimum number of CDR3 alpha/beta overlap wells to attempt matching?");
// lowThresholdCDR3 = sc.nextInt();
// if(lowThresholdCDR3 < 1){
// throw new InputMismatchException("Minimum value for low threshold is 1");
// }
// System.out.println("What is the maximum number of CDR3 alpha/beta overlap wells to attempt matching?");
// highThresholdCDR3 = sc.nextInt();
// System.out.println("What is the maximum difference in CDR3 alpha/beta occupancy to attempt matching?");
// maxOccupancyDiffCDR3 = sc.nextInt();
// System.out.println("What is the minimum CDR3 overlap percentage to attempt matching? (0 - 100)");
// minOverlapPercentCDR3 = sc.nextInt();
// if (minOverlapPercentCDR3 < 0 || minOverlapPercentCDR3 > 100) {
// throw new InputMismatchException("Value outside range. Minimum percent set to 0");
// }
// System.out.println("What is the minimum number of CDR3/CDR1 overlap wells to attempt matching?");
// lowThresholdCDR1 = sc.nextInt();
// if(lowThresholdCDR1 < 1){
// throw new InputMismatchException("Minimum value for low threshold is 1");
// }
// System.out.println("What is the maximum number of CDR3/CDR1 overlap wells to attempt matching?");
// highThresholdCDR1 = sc.nextInt();
// System.out.println("Matching CDR3s to CDR1s requires first matching CDR3 alpha/betas.");
// System.out.println("Output a file for CDR3 alpha/beta match results as well?");
// System.out.print("Please enter y/n: ");
// String ans = sc.next();
// Pattern pattern = Pattern.compile("(?:yes|y)", Pattern.CASE_INSENSITIVE);
// Matcher matcher = pattern.matcher(ans);
// if(matcher.matches()){
// outputCDR3Matches = true;
// System.out.println("Please enter filename for CDR3 alpha/beta match results");
// preliminaryResultsFilename = sc.next();
// System.out.println("CDR3 alpha/beta matches will be output to file");
// }
// else{
// System.out.println("CDR3 alpha/beta matches will not be output to file");
// }
// } catch (InputMismatchException ex) {
// System.out.println(ex);
// sc.next();
// }
// CellFileReader cellReader = new CellFileReader(cellFile);
// PlateFileReader plateReader = new PlateFileReader(plateFile);
// Plate plate = new Plate(plateReader.getFilename(), plateReader.getWells());
// if (cellReader.getCells().size() == 0){
// System.out.println("No cell sample found.");
// System.out.println("Returning to main menu.");
// }
// else if(plate.getWells().size() == 0){
// System.out.println("No sample plate found.");
// System.out.println("Returning to main menu.");
//
// }
// else{
// if(highThresholdCDR3 >= plate.getSize()){
// highThresholdCDR3 = plate.getSize() - 1;
// }
// if(highThresholdCDR1 >= plate.getSize()){
// highThresholdCDR1 = plate.getSize() - 1;
// }
// List<Integer[]> cells = cellReader.getCells();
// MatchingResult preliminaryResults = Simulator.matchCDR3s(cells, plate, lowThresholdCDR3, highThresholdCDR3,
// maxOccupancyDiffCDR3, minOverlapPercentCDR3, true);
// MatchingResult[] results = Simulator.matchCDR1s(cells, plate, lowThresholdCDR1,
// highThresholdCDR1, preliminaryResults);
// MatchingFileWriter writer = new MatchingFileWriter(filename + "_FirstPass", results[0]);
// writer.writeResultsToFile();
// writer = new MatchingFileWriter(filename + "_SecondPass", results[1]);
// writer.writeResultsToFile();
// if(outputCDR3Matches){
// writer = new MatchingFileWriter(preliminaryResultsFilename, preliminaryResults);
// writer.writeResultsToFile();
// }
// }
// }
private static void acknowledge(){
System.out.println("This program simulates BiGpairSEQ, a graph-theory-based adaptation");
System.out.println("of the pairSEQ algorithm for pairing T cell receptor sequences.");
System.out.println();
System.out.println("For full documentation, view readme.md file distributed with this code");
System.out.println("or visit https://gitea.ejsf.synology.me/efischer/BiGpairSEQ.");
System.out.println();
System.out.println("pairSEQ citation:");
System.out.println("Howie, B., Sherwood, A. M., et al.");
System.out.println("High-throughput pairing of T cell receptor alpha and beta sequences.");
System.out.println("Sci. Transl. Med. 7, 301ra131 (2015)");
System.out.println();
System.out.println("BiGpairSEQ_Sim by Eugene Fischer, 2021-2022");
}
}