I can reproduce your issue. The line ds_ls = "AzureBlobStorage" is wrong; it should be ds_ls = LinkedServiceReference(reference_name=ls_name).
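In other words, the dataset's linked_service_name must be a LinkedServiceReference object, not a plain string. A minimal sketch of just that fix (ls_name is the name you gave the linked service when you created it):

from azure.mgmt.datafactory.models import LinkedServiceReference

# Wrong: a bare string is not a valid linked service reference
# ds_ls = "AzureBlobStorage"

# Correct: wrap the linked service name in a LinkedServiceReference
ds_ls = LinkedServiceReference(reference_name=ls_name)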

You can refer to my complete working sample below.
Make sure your service principal has an RBAC role (e.g. Owner or Contributor) under Access control (IAM) on your data factory, and that you have completed all the prerequisites.
My package versions:
azure-mgmt-datafactory 0.6.0
azure-mgmt-resource 3.1.0
azure-common 1.1.23
Code:
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *

subscription_id = '<subscription-id>'
ls_name = 'storageLinkedService'
rg_name = '<group-name>'
df_name = '<datafactory-name>'

# Authenticate with the service principal
credentials = ServicePrincipalCredentials(client_id='<client id of the service principal>',
                                          secret='<secret of the service principal>',
                                          tenant='<tenant-id>')
resource_client = ResourceManagementClient(credentials, subscription_id)
adf_client = DataFactoryManagementClient(credentials, subscription_id)

# Create an Azure Storage linked service
storage_string = SecureString('DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>')
ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)

# Reference the linked service by name (this was the broken line in your code)
ds_ls = LinkedServiceReference(reference_name=ls_name)

# Create an Azure blob dataset (output)
dsOut_name = 'ds_out'
output_blobpath = '<container name>/<folder name>'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print(dsOut)
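If you want to double-check the result, you can read the dataset back from the factory. A small sketch (assuming the datasets.get call of this SDK version, which takes the resource group, factory, and dataset names):

# Optional: verify the dataset exists by fetching it again
ds_check = adf_client.datasets.get(rg_name, df_name, dsOut_name)
print(ds_check.name)

You should also see the dataset under Author > Datasets in the Data Factory UI once the call succeeds.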
