SageMaker HyperparameterTuner and fixed hyperparameters (StaticHyperParameters)
Question
I used to use this type of hyperparameter (optimisation) specification:
"OutputDataConfig": {"S3OutputPath": output_path},
"ResourceConfig": {"InstanceCount": 1, "InstanceType": "ml.m4.xlarge", "VolumeSizeInGB": 3},
"RoleArn": role_arn,
"StaticHyperParameters": {
"objective": "reg:squarederror"
},
"StoppingCondition": {"MaxRuntimeInSeconds": 10000}
TBH I do not even know if this is an old way of doing things or a different SDK - SageMaker is very confusing sometimes. Anyway, I want to use this SDK/API instead - more precisely the HyperparameterTuner. How would I specify StaticHyperParameters (e.g. "objective": "quantile")? Simply by not giving this hyperparameter a range and hard-coding it? Thanks!
Answer 1
Score: 1
The HyperparameterTuner takes an Estimator object as one of its parameters. You can keep the static hyperparameters as part of the estimator, something like below:
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="mnist.py",
    role=role,
    py_version="py3",
    framework_version="1.8.0",
    instance_count=1,
    instance_type="ml.c5.2xlarge",
    # Static hyperparameters: passed unchanged to every training job the tuner launches.
    hyperparameters={"epochs": 1, "backend": "gloo"},
)
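Since the question's original config targets the built-in XGBoost algorithm rather than PyTorch, here is a minimal sketch of the same idea with the generic Estimator; the container version and region are assumptions, and role_arn/output_path are the variables carried over from the question:

from sagemaker import image_uris
from sagemaker.estimator import Estimator

# Resolve a built-in XGBoost container image (region and version are assumptions).
xgb_image = image_uris.retrieve("xgboost", region="eu-west-1", version="1.5-1")

estimator = Estimator(
    image_uri=xgb_image,
    role=role_arn,                 # RoleArn
    instance_count=1,              # ResourceConfig.InstanceCount
    instance_type="ml.m4.xlarge",  # ResourceConfig.InstanceType
    volume_size=3,                 # ResourceConfig.VolumeSizeInGB
    output_path=output_path,       # OutputDataConfig.S3OutputPath
    max_run=10000,                 # StoppingCondition.MaxRuntimeInSeconds
    # The equivalent of StaticHyperParameters: fixed across all tuning jobs.
    hyperparameters={"objective": "reg:squarederror"},
)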
Once you have the estimator initialized, you can pass it to the tuner along with the parameters that have to be tuned, as shown below:
from sagemaker.tuner import (
    CategoricalParameter,
    ContinuousParameter,
    HyperparameterTuner,
)

# Only these hyperparameters are varied; everything set on the estimator stays fixed.
hyperparameter_ranges = {
    "lr": ContinuousParameter(0.001, 0.1),
    "batch-size": CategoricalParameter([32, 64, 128, 256, 512]),
}

tuner = HyperparameterTuner(
    estimator,
    objective_metric_name,
    hyperparameter_ranges,
    metric_definitions,
    max_jobs=9,
    max_parallel_jobs=3,
    objective_type=objective_type,
)
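Note that objective_metric_name, metric_definitions, and objective_type are left undefined in the snippet above. As a rough sketch, one plausible set of values, assuming mnist.py prints a test-loss line the tuner can scrape from the training logs (the metric name and regex below are assumptions, not part of the original answer):

# Hypothetical placeholder values; they assume mnist.py logs a line like
# "Test set: Average loss: 0.0423" during evaluation.
objective_metric_name = "average test loss"
objective_type = "Minimize"
metric_definitions = [
    {"Name": "average test loss", "Regex": "Test set: Average loss: ([0-9\\.]+)"}
]

# Start the tuning job; "training" is the input channel name the script is
# assumed to read, and the S3 path is a placeholder.
tuner.fit({"training": "s3://my-bucket/mnist-data"})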
Please refer to this example for a complete solution.