@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class FindMatchesMetrics extends Object implements Serializable, Cloneable, StructuredPojo
The evaluation metrics for the find matches algorithm. The quality of your machine learning transform is measured by getting your transform to predict some matches and comparing the results to known matches from the same dataset. The quality metrics are based on a subset of your data, so they are not precise.
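As a quick, SDK-independent illustration of how the precision, recall, and F1 values reported by this class relate to each other, here is a sketch of the standard definitions using hypothetical counts (this is not the service's internal computation):

```java
public class MetricsSketch {
    public static void main(String[] args) {
        // Hypothetical counts from comparing predicted matches to known matches.
        double truePositives = 80;   // predicted match, actually a match
        double falsePositives = 20;  // predicted match, actually not a match
        double falseNegatives = 10;  // missed an actual match

        // Standard definitions of the metrics this class exposes.
        double precision = truePositives / (truePositives + falsePositives);
        double recall = truePositives / (truePositives + falseNegatives);
        double f1 = 2 * precision * recall / (precision + recall);

        System.out.printf("precision=%.3f recall=%.3f f1=%.3f%n", precision, recall, f1);
    }
}
```

Higher precision means fewer false matches; higher recall means fewer missed matches; F1 is their harmonic mean.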
Constructor and Description 

FindMatchesMetrics() 
Modifier and Type  Method and Description 

FindMatchesMetrics 
clone() 
boolean 
equals(Object obj) 
Double 
getAreaUnderPRCurve()
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.

List<ColumnImportance> 
getColumnImportances()
A list of
ColumnImportance structures containing column importance metrics, sorted in order of
descending importance. 
ConfusionMatrix 
getConfusionMatrix()
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is
making.

Double 
getF1()
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

Double 
getPrecision()
The precision metric indicates how often your transform is correct when it predicts a match.

Double 
getRecall()
The recall metric indicates how often, for an actual match, your transform predicts the match.

int 
hashCode() 
void 
marshall(ProtocolMarshaller protocolMarshaller)
Marshalls this structured data using the given ProtocolMarshaller.
void 
setAreaUnderPRCurve(Double areaUnderPRCurve)
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.

void 
setColumnImportances(Collection<ColumnImportance> columnImportances)
A list of
ColumnImportance structures containing column importance metrics, sorted in order of
descending importance. 
void 
setConfusionMatrix(ConfusionMatrix confusionMatrix)
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is
making.

void 
setF1(Double f1)
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

void 
setPrecision(Double precision)
The precision metric indicates how often your transform is correct when it predicts a match.

void 
setRecall(Double recall)
The recall metric indicates how often, for an actual match, your transform predicts the match.

String 
toString()
Returns a string representation of this object.

FindMatchesMetrics 
withAreaUnderPRCurve(Double areaUnderPRCurve)
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.

FindMatchesMetrics 
withColumnImportances(Collection<ColumnImportance> columnImportances)
A list of
ColumnImportance structures containing column importance metrics, sorted in order of
descending importance. 
FindMatchesMetrics 
withColumnImportances(ColumnImportance... columnImportances)
A list of
ColumnImportance structures containing column importance metrics, sorted in order of
descending importance. 
FindMatchesMetrics 
withConfusionMatrix(ConfusionMatrix confusionMatrix)
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is
making.

FindMatchesMetrics 
withF1(Double f1)
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.

FindMatchesMetrics 
withPrecision(Double precision)
The precision metric indicates how often your transform is correct when it predicts a match.

FindMatchesMetrics 
withRecall(Double recall)
The recall metric indicates how often, for an actual match, your transform predicts the match.

public void setAreaUnderPRCurve(Double areaUnderPRCurve)
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall. Higher values indicate a more attractive precision vs. recall tradeoff.
For more information, see Precision and recall in Wikipedia.
Parameters:
areaUnderPRCurve - The area under the precision/recall curve (AUPRC), a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.
public Double getAreaUnderPRCurve()
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall. Higher values indicate a more attractive precision vs. recall tradeoff.
For more information, see Precision and recall in Wikipedia.
Returns:
The area under the precision/recall curve (AUPRC) for the transform.
public FindMatchesMetrics withAreaUnderPRCurve(Double areaUnderPRCurve)
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall. Higher values indicate a more attractive precision vs. recall tradeoff.
For more information, see Precision and recall in Wikipedia.
Parameters:
areaUnderPRCurve - The area under the precision/recall curve (AUPRC), a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.
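The AUPRC value can be pictured as the area under a curve of precision plotted against recall. A minimal sketch of that idea, using the trapezoidal rule over hypothetical curve points (this illustrates the concept only, not how the service computes the metric):

```java
public class AuprcSketch {
    // Trapezoidal approximation of the area under a precision/recall curve.
    // Points must be sorted by ascending recall.
    static double areaUnderPRCurve(double[] recall, double[] precision) {
        double area = 0.0;
        for (int i = 1; i < recall.length; i++) {
            area += (recall[i] - recall[i - 1])
                    * (precision[i] + precision[i - 1]) / 2.0;
        }
        return area;
    }

    public static void main(String[] args) {
        // Hypothetical curve: precision degrades as recall increases.
        double[] recall = {0.0, 0.5, 1.0};
        double[] precision = {1.0, 0.8, 0.6};
        System.out.println(areaUnderPRCurve(recall, precision));
    }
}
```

A transform whose precision stays high across all recall values yields an area close to 1.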
public void setPrecision(Double precision)
The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.
For more information, see Precision and recall in Wikipedia.
Parameters:
precision - The precision metric: how often your transform is correct when it predicts a match.
public Double getPrecision()
The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.
For more information, see Precision and recall in Wikipedia.
Returns:
The precision metric for the transform.
public FindMatchesMetrics withPrecision(Double precision)
The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures how well the transform finds true positives from the total true positives possible.
For more information, see Precision and recall in Wikipedia.
Parameters:
precision - The precision metric: how often your transform is correct when it predicts a match.
public void setRecall(Double recall)
The recall metric indicates how often, for an actual match, your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.
For more information, see Precision and recall in Wikipedia.
Parameters:
recall - The recall metric: how often, for an actual match, your transform predicts the match.
public Double getRecall()
The recall metric indicates how often, for an actual match, your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.
For more information, see Precision and recall in Wikipedia.
Returns:
The recall metric for the transform.
public FindMatchesMetrics withRecall(Double recall)
The recall metric indicates how often, for an actual match, your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total records in the source data.
For more information, see Precision and recall in Wikipedia.
Parameters:
recall - The recall metric: how often, for an actual match, your transform predicts the match.
public void setF1(Double f1)
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.
For more information, see F1 score in Wikipedia.
Parameters:
f1 - The maximum F1 metric, between 0 and 1, where 1 is the best accuracy.
public Double getF1()
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.
For more information, see F1 score in Wikipedia.
Returns:
The maximum F1 metric for the transform.
public FindMatchesMetrics withF1(Double f1)
The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.
For more information, see F1 score in Wikipedia.
Parameters:
f1 - The maximum F1 metric, between 0 and 1, where 1 is the best accuracy.
public void setConfusionMatrix(ConfusionMatrix confusionMatrix)
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
For more information, see Confusion matrix in Wikipedia.
Parameters:
confusionMatrix - The confusion matrix for the transform, showing accurate predictions and error types.
public ConfusionMatrix getConfusionMatrix()
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
For more information, see Confusion matrix in Wikipedia.
Returns:
The confusion matrix for the transform.
public FindMatchesMetrics withConfusionMatrix(ConfusionMatrix confusionMatrix)
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
For more information, see Confusion matrix in Wikipedia.
Parameters:
confusionMatrix - The confusion matrix for the transform, showing accurate predictions and error types.
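The four cells of a confusion matrix can be tallied directly from predicted versus actual labels. A self-contained sketch with hypothetical labels (the SDK's ConfusionMatrix class carries these counts as fields; the names below are illustrative):

```java
public class ConfusionSketch {
    public static void main(String[] args) {
        // Hypothetical actual vs. predicted "is a match" labels for five record pairs.
        boolean[] actual    = {true, true, false, false, true};
        boolean[] predicted = {true, false, false, true, true};

        long tp = 0, tn = 0, fp = 0, fn = 0;
        for (int i = 0; i < actual.length; i++) {
            if (actual[i] && predicted[i]) tp++;        // correctly predicted match
            else if (!actual[i] && !predicted[i]) tn++; // correctly predicted non-match
            else if (predicted[i]) fp++;                // predicted a match that wasn't
            else fn++;                                  // missed an actual match
        }
        System.out.printf("tp=%d tn=%d fp=%d fn=%d%n", tp, tn, fp, fn);
        // prints: tp=2 tn=1 fp=1 fn=1
    }
}
```

False positives and false negatives are the two error types the matrix distinguishes; precision and recall are each computed from a different pair of these cells.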
public List<ColumnImportance> getColumnImportances()
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.
Returns:
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.

public void setColumnImportances(Collection<ColumnImportance> columnImportances)
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.
Parameters:
columnImportances - A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.

public FindMatchesMetrics withColumnImportances(ColumnImportance... columnImportances)
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.
NOTE: This method appends the values to the existing list (if any). Use setColumnImportances(java.util.Collection) or withColumnImportances(java.util.Collection) if you want to override the existing values.
Parameters:
columnImportances - A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.

public FindMatchesMetrics withColumnImportances(Collection<ColumnImportance> columnImportances)
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.
Parameters:
columnImportances - A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.

public String toString()
Returns a string representation of this object.
Overrides:
toString in class Object
See Also:
Object.toString()
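The descending-importance ordering documented for getColumnImportances can be reproduced for any map of scores; a plain-Java sketch with hypothetical column scores (not the SDK's ColumnImportance type):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ImportanceSortSketch {
    public static void main(String[] args) {
        // Hypothetical column-name -> importance scores.
        Map<String, Double> scores = new HashMap<>();
        scores.put("zip", 0.20);
        scores.put("last_name", 0.45);
        scores.put("email", 0.35);

        // Sort entries by descending importance, mirroring the documented order.
        List<Map.Entry<String, Double>> sorted = new ArrayList<>(scores.entrySet());
        sorted.sort(Map.Entry.<String, Double>comparingByValue().reversed());

        for (Map.Entry<String, Double> e : sorted) {
            System.out.println(e.getKey() + " " + e.getValue());
        }
    }
}
```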
public FindMatchesMetrics clone()
public void marshall(ProtocolMarshaller protocolMarshaller)
Marshalls this structured data using the given ProtocolMarshaller.
Specified by:
marshall in interface StructuredPojo
Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.