How to install the Azure ML SDK

The latest info is always here.

How to retrieve the current workspace from a remote Run context

from azureml.core import Run

# inside a remote run, the context exposes the parent experiment and its workspace
ws = Run.get_context().experiment.workspace
print(ws)

Useful in pipelines, see this article for a complete example.

How to download a model from the models repository

import joblib
from azureml.core.model import Model

# ws is the workspace retrieved as shown above
model_ws = Model(ws, '<your-model-name>')
pickled_model_path = model_ws.download(exist_ok=True)

model = joblib.load(pickled_model_path)
print(model)

Testing pipeline steps locally

This mostly works with the current version of the SDK; however, when you try to access things like the current experiment or its workspace, you get a nice AttributeError: '_OfflineRun' object has no attribute 'experiment'. You can prevent this by checking whether the experiment attribute exists and initializing the workspace from a local config.json if it doesn't.

from azureml.core import Run, Workspace

aml_context = Run.get_context()
# assumes a config.json file exists in the current or a parent directory
ws = Workspace.from_config() if not hasattr(aml_context, 'experiment') else aml_context.experiment.workspace

Alternate way which looks better to me as a C# dev, but I can’t stop wondering if it’s pythonic enough. 🤨

from azureml.core import Run, Workspace
from azureml.core.run import _OfflineRun

aml_context = Run.get_context()
ws = Workspace.from_config() if isinstance(aml_context, _OfflineRun) else aml_context.experiment.workspace
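The two checks differ only in how they detect the offline case — duck typing with hasattr versus an explicit type check. Stripped of the Azure ML types, the pattern looks like this (OfflineRun, OnlineRun and get_workspace here are made-up stand-ins for illustration, not SDK classes):

```python
class OfflineRun:
    """Stand-in for azureml's _OfflineRun: has no experiment attribute."""
    pass

class OnlineRun:
    """Stand-in for a remote Run: carries an experiment with a workspace."""
    class _Experiment:
        workspace = "remote-workspace"
    experiment = _Experiment()

def get_workspace(run):
    # duck-typing variant: fall back to the local config when there is no experiment
    if not hasattr(run, 'experiment'):
        return "workspace-from-local-config"
    return run.experiment.workspace

print(get_workspace(OfflineRun()))  # workspace-from-local-config
print(get_workspace(OnlineRun()))   # remote-workspace
```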

Requesting a compute quota increase

By default you get 10 cores, which may or may not be enough. You can request an increase by contacting support; see this and this for details.

Authenticating to a different tenant than usual

from azureml.core import Workspace
from azureml.core.authentication import InteractiveLoginAuthentication

forced_interactive_auth = InteractiveLoginAuthentication(tenant_id='your-tenant-id', force=True)
ws = Workspace.from_config(auth=forced_interactive_auth)

– via Azure Machine Learning Notebooks

High costs associated with the Load Balancer of the Machine Learning Workspace

This can happen even though you're not consciously using any load balancer in there. In my experience, it tends to show up after you've created some compute instances to run notebooks on. What worked for me was to completely delete those compute instances (and never use notebooks in AML again 🤷🏻‍♂️); see this thread for some other recommendations.

Debugging online endpoints locally

The Azure Machine Learning inference HTTP server is really nice.

python -m pip install azureml-inference-server-http

azmlinfsrv --entry_script score.py
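The server expects the entry script to follow the standard scoring contract — an init() that loads the model once at startup and a run() that handles each request. A minimal score.py sketch for smoke-testing the server (the echo logic is a placeholder, not a real model):

```python
import json

model = None

def init():
    # normally you'd load the real model here, e.g. with joblib from AZUREML_MODEL_DIR
    global model
    model = "dummy-model"

def run(raw_data):
    # raw_data is the request body as a string; echo it back along with the model name
    data = json.loads(raw_data)
    return {"model": model, "echo": data}
```

Once the server is up, POST a JSON payload to the scoring endpoint (by default it listens locally on port 5001 at /score) to see the response.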

You can also use VS Code, but I like this approach better.