How can I save an Azure ML sklearn model to a specific blob storage container?
Question
Saving an sklearn model to blob storage from AzureML
Simply put, I have a sklearn model in Azure ML. I want to write the model to a specific blob storage container to save it. I've already authenticated the connection to the right blob container. Per the docs:
# Create a blob client using the local file name as the name for the blob
blob_client = blob_service_client.get_blob_client(container=container_name, blob=file_name)

# Upload the created file
with open(file=upload_file_path, mode="rb") as data:
    blob_client.upload_blob(data)
However, this requires me to first save the model to a local file. Is there a local file store associated with AzureML?
Alternatively, could I write back to a datastore but point the datastore to a specific storage container?
Answer 1
Score: 1
First, save your model locally as shown below.
import os
import mlflow.sklearn

# Save the trained sklearn estimator (clf) to a local folder in MLflow format
mlflow.sklearn.save_model(
    sk_model=clf,
    path=os.path.join(os.path.abspath('credit_defaults_model'), "trained_model")
)
Here, os.path.abspath('credit_defaults_model') resolves to a path on the local filesystem where the code runs, so executing the code above saves the model there, and you will then see the files highlighted in the image. Later, you upload the model.pkl file to blob storage with the code that follows the optional check below.
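(Optional) Before uploading, you can confirm what mlflow.sklearn.save_model wrote by listing the output folder. This is a small sketch assuming the same folder name as above; the exact file set depends on your mlflow version.

import os

# Folder used in the save_model call above
model_dir = os.path.join(os.path.abspath('credit_defaults_model'), "trained_model")

# Typically contains MLmodel, model.pkl, conda.yaml, python_env.yaml, requirements.txt
print(os.listdir(model_dir))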
import os

from azure.storage.blob import BlobServiceClient

connection_string = "<your connection string>"
container_name = "data"
local_file_path = os.path.join(os.path.abspath('credit_defaults_model'), "trained_model/model.pkl")
file_name = "from_ml/sklearn.pkl"

# Connect to the storage account and get a client for the target container and blob
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
container_client = blob_service_client.get_container_client(container_name)
blob_client = container_client.get_blob_client(file_name)

# Upload the locally saved model file to the blob
with open(local_file_path, "rb") as data:
    blob_client.upload_blob(data)

print("File uploaded successfully!")
Output:
Here, the model is uploaded successfully to the blob.
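As a quick optional sanity check, reusing the container_client created above, you can list the blobs under the from_ml/ prefix to confirm the upload:

# List blobs under the upload prefix to confirm the file landed in the container
for blob in container_client.list_blobs(name_starts_with="from_ml/"):
    print(blob.name, blob.size)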
Answer 2
Score: 0
There's no need to use a temporary file to upload your model. You can simply pass it directly to the upload_blob function; its data argument can be a pickled bytes object, for example.
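A minimal sketch of that approach, assuming clf is the trained sklearn model and blob_client is set up as in the question:

import pickle

# Serialize the model in memory instead of writing it to a local file
data = pickle.dumps(clf)

# upload_blob accepts bytes directly; overwrite=True replaces an existing blob
blob_client.upload_blob(data, overwrite=True)

To restore the model later, download the blob content and call pickle.loads on the bytes.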
Comments