@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class HumanEvaluationConfig extends Object implements Serializable, Cloneable, StructuredPojo
Specifies the custom metrics, how tasks will be rated, the flow definition ARN, and your custom prompt datasets. Model evaluation jobs that use human workers support only custom prompt datasets. To learn more about custom prompt datasets and the required format, see Custom prompt datasets.
When you create custom metrics in HumanEvaluationCustomMetric, you must specify the metric's name. The list of names specified in the HumanEvaluationCustomMetric array must match the metricNames array of strings specified in EvaluationDatasetMetricConfig. For example, if in the HumanEvaluationCustomMetric array you specified the names "accuracy", "toxicity", and "readability" as custom metrics, then the metricNames array in EvaluationDatasetMetricConfig would need to be ["accuracy", "toxicity", "readability"].
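The name pairing described above can be sketched as follows. This is a minimal sketch, not a complete request: the dataset name, S3 URI, flow definition ARN, task type, and rating method shown here are placeholder values you would replace with your own.

```java
import com.amazonaws.services.bedrock.model.EvaluationDataset;
import com.amazonaws.services.bedrock.model.EvaluationDatasetLocation;
import com.amazonaws.services.bedrock.model.EvaluationDatasetMetricConfig;
import com.amazonaws.services.bedrock.model.HumanEvaluationConfig;
import com.amazonaws.services.bedrock.model.HumanEvaluationCustomMetric;
import com.amazonaws.services.bedrock.model.HumanWorkflowConfig;

public class HumanEvalConfigSketch {

    public static HumanEvaluationConfig build() {
        // Each custom metric is named here...
        HumanEvaluationCustomMetric accuracy = new HumanEvaluationCustomMetric()
                .withName("accuracy")
                .withRatingMethod("ThumbsUpDown");   // rating method is a service-defined value
        HumanEvaluationCustomMetric toxicity = new HumanEvaluationCustomMetric()
                .withName("toxicity")
                .withRatingMethod("ThumbsUpDown");

        // ...and the same names must appear in metricNames here.
        EvaluationDatasetMetricConfig metricConfig = new EvaluationDatasetMetricConfig()
                .withTaskType("Generation")          // assumed task type for this sketch
                .withDataset(new EvaluationDataset()
                        .withName("my-prompts")      // placeholder dataset name
                        .withDatasetLocation(new EvaluationDatasetLocation()
                                .withS3Uri("s3://amzn-s3-demo-bucket/prompts.jsonl"))) // placeholder URI
                .withMetricNames("accuracy", "toxicity");

        return new HumanEvaluationConfig()
                .withCustomMetrics(accuracy, toxicity)
                .withDatasetMetricConfigs(metricConfig)
                .withHumanWorkflowConfig(new HumanWorkflowConfig()
                        // placeholder SageMaker flow definition ARN
                        .withFlowDefinitionArn(
                                "arn:aws:sagemaker:us-east-1:111122223333:flow-definition/my-flow")
                        .withInstructions("Rate each response for accuracy and toxicity."));
    }
}
```

If the two name lists diverge, the service rejects the evaluation-job request, so it can help to declare the metric names once as constants and reuse them in both places.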
Constructor and Description |
---|
HumanEvaluationConfig() |
Modifier and Type | Method and Description |
---|---|
HumanEvaluationConfig | clone() |
boolean | equals(Object obj) |
List<HumanEvaluationCustomMetric> | getCustomMetrics() A HumanEvaluationCustomMetric object. |
List<EvaluationDatasetMetricConfig> | getDatasetMetricConfigs() Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job. |
HumanWorkflowConfig | getHumanWorkflowConfig() The parameters of the human workflow. |
int | hashCode() |
void | marshall(ProtocolMarshaller protocolMarshaller) Marshalls this structured data using the given ProtocolMarshaller. |
void | setCustomMetrics(Collection<HumanEvaluationCustomMetric> customMetrics) A HumanEvaluationCustomMetric object. |
void | setDatasetMetricConfigs(Collection<EvaluationDatasetMetricConfig> datasetMetricConfigs) Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job. |
void | setHumanWorkflowConfig(HumanWorkflowConfig humanWorkflowConfig) The parameters of the human workflow. |
String | toString() Returns a string representation of this object. |
HumanEvaluationConfig | withCustomMetrics(Collection<HumanEvaluationCustomMetric> customMetrics) A HumanEvaluationCustomMetric object. |
HumanEvaluationConfig | withCustomMetrics(HumanEvaluationCustomMetric... customMetrics) A HumanEvaluationCustomMetric object. |
HumanEvaluationConfig | withDatasetMetricConfigs(Collection<EvaluationDatasetMetricConfig> datasetMetricConfigs) Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job. |
HumanEvaluationConfig | withDatasetMetricConfigs(EvaluationDatasetMetricConfig... datasetMetricConfigs) Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job. |
HumanEvaluationConfig | withHumanWorkflowConfig(HumanWorkflowConfig humanWorkflowConfig) The parameters of the human workflow. |
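The varargs and Collection overloads of the fluent with* methods differ: the varargs form appends to any existing list, while the Collection form (like the plain setter) replaces it. A small sketch of the difference, assuming the standard behavior of generated SDK model classes:

```java
import java.util.Arrays;
import com.amazonaws.services.bedrock.model.HumanEvaluationConfig;
import com.amazonaws.services.bedrock.model.HumanEvaluationCustomMetric;

public class AppendVsOverride {

    public static int appendThenCount() {
        HumanEvaluationConfig config = new HumanEvaluationConfig()
                .withCustomMetrics(new HumanEvaluationCustomMetric().withName("accuracy"));
        // The varargs form appends to the existing list, so the list grows to 2.
        config.withCustomMetrics(new HumanEvaluationCustomMetric().withName("toxicity"));
        return config.getCustomMetrics().size();
    }

    public static int overrideThenCount() {
        HumanEvaluationConfig config = new HumanEvaluationConfig()
                .withCustomMetrics(new HumanEvaluationCustomMetric().withName("accuracy"));
        // The Collection setter replaces the list wholesale, so the list stays at 1.
        config.setCustomMetrics(Arrays.asList(
                new HumanEvaluationCustomMetric().withName("toxicity")));
        return config.getCustomMetrics().size();
    }
}
```

The same append-vs-replace split applies to the withDatasetMetricConfigs overloads.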
public void setHumanWorkflowConfig(HumanWorkflowConfig humanWorkflowConfig)
The parameters of the human workflow.
Parameters:
humanWorkflowConfig - The parameters of the human workflow.

public HumanWorkflowConfig getHumanWorkflowConfig()
The parameters of the human workflow.

public HumanEvaluationConfig withHumanWorkflowConfig(HumanWorkflowConfig humanWorkflowConfig)
The parameters of the human workflow.
Parameters:
humanWorkflowConfig - The parameters of the human workflow.

public List<HumanEvaluationCustomMetric> getCustomMetrics()
A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.
Returns:
A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.

public void setCustomMetrics(Collection<HumanEvaluationCustomMetric> customMetrics)
A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.
Parameters:
customMetrics - A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.

public HumanEvaluationConfig withCustomMetrics(HumanEvaluationCustomMetric... customMetrics)
A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.
NOTE: This method appends the values to the existing list (if any). Use setCustomMetrics(java.util.Collection) or withCustomMetrics(java.util.Collection) if you want to override the existing values.
Parameters:
customMetrics - A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.

public HumanEvaluationConfig withCustomMetrics(Collection<HumanEvaluationCustomMetric> customMetrics)
A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.
Parameters:
customMetrics - A HumanEvaluationCustomMetric object. It contains the names of the metrics, how the metrics are to be evaluated, and an optional description.

public List<EvaluationDatasetMetricConfig> getDatasetMetricConfigs()
Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.
public void setDatasetMetricConfigs(Collection<EvaluationDatasetMetricConfig> datasetMetricConfigs)
Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.
Parameters:
datasetMetricConfigs - Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.

public HumanEvaluationConfig withDatasetMetricConfigs(EvaluationDatasetMetricConfig... datasetMetricConfigs)
Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.
NOTE: This method appends the values to the existing list (if any). Use setDatasetMetricConfigs(java.util.Collection) or withDatasetMetricConfigs(java.util.Collection) if you want to override the existing values.
Parameters:
datasetMetricConfigs - Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.

public HumanEvaluationConfig withDatasetMetricConfigs(Collection<EvaluationDatasetMetricConfig> datasetMetricConfigs)
Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.
Parameters:
datasetMetricConfigs - Use to specify the metrics, task, and prompt dataset to be used in your model evaluation job.

public String toString()
Returns a string representation of this object.
Overrides:
toString in class Object
See Also:
Object.toString()
public HumanEvaluationConfig clone()
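Like other generated SDK model classes, HumanEvaluationConfig implements value-based equals and hashCode, so a clone compares equal to the original while remaining a distinct instance. A quick sketch (the ARN is a placeholder):

```java
import com.amazonaws.services.bedrock.model.HumanEvaluationConfig;
import com.amazonaws.services.bedrock.model.HumanWorkflowConfig;

public class CloneEqualsSketch {

    public static boolean cloneIsEqualButDistinct() {
        HumanEvaluationConfig original = new HumanEvaluationConfig()
                .withHumanWorkflowConfig(new HumanWorkflowConfig()
                        // placeholder flow definition ARN
                        .withFlowDefinitionArn(
                                "arn:aws:sagemaker:us-east-1:111122223333:flow-definition/demo"));
        HumanEvaluationConfig copy = original.clone();
        // Value equality holds between the copy and the original...
        // ...but they are separate objects, so reference equality does not.
        return copy.equals(original) && copy != original;
    }
}
```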
public void marshall(ProtocolMarshaller protocolMarshaller)
Description copied from interface: StructuredPojo
Marshalls this structured data using the given ProtocolMarshaller.
Specified by:
marshall in interface StructuredPojo
Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.