Azure onboarding with Yotascale Security

Yotascale currently needs access only to the Cost Management Export files that provide daily
Cost Management data, which Azure exports into Azure Blob Storage.
To access these Export files (which are generated in CSV format), we only need the
Storage account Access Keys.
There are therefore two steps to automate the process of providing read access to the Export CSV
files on a daily basis.


One-time process to get the Export files location and
respective Access Keys to read data daily


Yotascale needs to capture information about the Azure Storage Account where the cost Export
files are saved on a daily basis. The information needed is:

  1. Storage account name

  2. Container name

  3. Directory name

  4. Export file name (there are two of them, one for Actual cost, and one for Amortized)


This is a screenshot of how they are originally created in the Azure Portal:

To make the onboarding process easier, we ask the customer to have a user that has the “Read
and Data Access” role on the subscription where the Export files are stored, so that we can
grab the storage account name, container name, directory name, and export file name automatically.
We do not store any user information; we only need these fields plus the Storage blob Access
Keys.
During the onboarding process with Yotascale, we read this data:

  • One-time steps to capture information about where the Cost Export CSV files are stored:

    • Subscriptions list (to allow the user to pick one from the drop-down)

    • Storage accounts per subscription (to allow the user to pick one from the
      drop-down)

    • Container List (to allow the user to pick one from the drop-down)

    • A free text field for the user to type the Directory (Azure does not have an API for
      that)

    • A free text field for the user to type the Export file name (Azure does not have an
      API for that)


The values we store at Yotascale are:
● Storage account name
● Container name
● Directory name
● Export filename
● Storage account access key

Azure onboarding end-points we call while we do the initial onboarding:

listSubscriptions:
<https://management.azure.com/subscriptions?api-version=2018-02-01>

storageAccounts:
<https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.Storage/storageAccounts?api-version=2019-06-01>

storageAccountListKey:
<https://management.azure.com{subscriptionsID_resourceGroup}/listKeys?api-version=2019-06-01>

containers:
<https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}/blobServices/default/containers?api-version=2021-04-01>

exportFileAtSubscription:
<https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.CostManagement/exports?api-version=2020-06-01>
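The discovery calls above can be sketched in Python. This is an illustrative sketch, not Yotascale's actual implementation: the helper names are our own, and it assumes the `requests` and `azure-identity` packages with a credential that has the Reader role described earlier.

```python
# Sketch of the one-time onboarding discovery calls (illustrative only).

ARM = "https://management.azure.com"

def list_subscriptions_url():
    # Subscriptions the credential can read (drives the first drop-down).
    return f"{ARM}/subscriptions?api-version=2018-02-01"

def list_storage_accounts_url(subscription_id):
    # Storage accounts within the chosen subscription.
    return (f"{ARM}/subscriptions/{subscription_id}"
            f"/providers/Microsoft.Storage/storageAccounts"
            f"?api-version=2019-06-01")

def list_containers_url(subscription_id, resource_group, account_name):
    # Blob containers within the chosen storage account.
    return (f"{ARM}/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Storage/storageAccounts/{account_name}"
            f"/blobServices/default/containers?api-version=2021-04-01")

def arm_get(url):
    # Imports kept local so the URL helpers above stay dependency-free.
    import requests
    from azure.identity import DefaultAzureCredential
    token = DefaultAzureCredential().get_token(f"{ARM}/.default").token
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    # ARM list responses wrap results in a "value" array.
    return resp.json().get("value", [])
```

The directory and export file name have no ARM list endpoint, which is why they remain free-text fields in the onboarding UI.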

This is the UI modal where we collect the values mentioned above:

All the endpoints are called from these IP address ranges (the Yotascale production environment):
● 172.31.0.0/16 is our VPC CIDR, which hosts all our infrastructure/microservices.
● Our web application is hosted via CloudFront, in addition to microservices hosted within the
above VPC. IP ranges for CloudFront can be found here:
https://d7uri8nf7uskq.cloudfront.net/tools/list-cloudfront-ips
{"CLOUDFRONT_GLOBAL_IP_LIST": ["120.52.22.96/27", "205.251.249.0/24",
"180.163.57.128/26", "204.246.168.0/22", "205.251.252.0/23", "54.192.0.0/16",
"204.246.173.0/24", "54.230.200.0/21", "120.253.240.192/26",
"116.129.226.128/26", "130.176.0.0/17", "108.156.0.0/14", "99.86.0.0/16",
"205.251.200.0/21", "223.71.71.128/25", "13.32.0.0/15", "120.253.245.128/26",
"13.224.0.0/14", "70.132.0.0/18", "15.158.0.0/16", "13.249.0.0/16",
"205.251.208.0/20", "65.9.128.0/18", "130.176.128.0/18", "58.254.138.0/25",
"54.230.208.0/20", "116.129.226.0/25", "52.222.128.0/17", "64.252.128.0/18",
"205.251.254.0/24", "54.230.224.0/19", "71.152.0.0/17", "216.137.32.0/19",
"204.246.172.0/24", "120.52.39.128/27", "118.193.97.64/26", "223.71.71.96/27",
"54.240.128.0/18", "205.251.250.0/23", "180.163.57.0/25", "52.46.0.0/18",
"223.71.11.0/27", "52.82.128.0/19", "54.230.0.0/17", "54.230.128.0/18",
"54.239.128.0/18", "130.176.224.0/20", "36.103.232.128/26", "52.84.0.0/15",
"143.204.0.0/16", "144.220.0.0/16", "120.52.153.192/26", "119.147.182.0/25",
"120.232.236.0/25", "54.182.0.0/16", "58.254.138.128/26", "120.253.245.192/27",
"54.239.192.0/19", "18.64.0.0/14", "120.52.12.64/26", "99.84.0.0/16",
"130.176.192.0/19", "52.124.128.0/17", "204.246.164.0/22", "13.35.0.0/16",
"204.246.174.0/23", "36.103.232.0/25", "119.147.182.128/26",
"118.193.97.128/25", "120.232.236.128/26", "204.246.176.0/20", "65.8.0.0/16",
"65.9.0.0/17", "108.138.0.0/15", "120.253.241.160/27", "64.252.64.0/18"],
"CLOUDFRONT_REGIONAL_EDGE_IP_LIST": ["13.113.196.64/26", "13.113.203.0/24",
"52.199.127.192/26", "13.124.199.0/24", "3.35.130.128/25", "52.78.247.128/26",
"13.233.177.192/26", "15.207.13.128/25", "15.207.213.128/25",
"52.66.194.128/26", "13.228.69.0/24", "52.220.191.0/26", "13.210.67.128/26",
"13.54.63.128/26", "99.79.169.0/24", "18.192.142.0/23", "35.158.136.0/24",
"52.57.254.0/24", "13.48.32.0/24", "18.200.212.0/23", "52.212.248.0/26",
"3.10.17.128/25", "3.11.53.0/24", "52.56.127.0/25", "15.188.184.0/24",
"52.47.139.0/24", "18.229.220.192/26", "54.233.255.128/26", "3.231.2.0/25",
"3.234.232.224/27", "3.236.169.192/26", "3.236.48.0/23", "34.195.252.0/24",
"34.226.14.0/24", "13.59.250.0/26", "18.216.170.128/25", "3.128.93.0/24",
"3.134.215.0/24", "52.15.127.128/26", "3.101.158.0/23", "52.52.191.128/26",
"34.216.51.0/25", "34.223.12.224/27", "34.223.80.192/26", "35.162.63.192/26",
"35.167.191.128/26", "44.227.178.0/24", "44.234.108.128/25",
"44.234.90.252/30"]}

Daily basis: what we need


On a daily basis, we have a service that copies the Export CSV files from the customer's Azure
storage account > container > directory > export name > date range > CSV files
into a Yotascale AWS S3 bucket.


As such, the parameters we use to copy the files are:
● StorageAccountName
● ContainerName
● DirectoryName
● ExportName
● AccessKey


We use a Python library to copy the Export files from Azure to Yotascale. This is the Microsoft
Azure Blob Storage Client Library for Python that we use:
https://pypi.org/project/azure-storage-blob/
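As a rough sketch of the daily copy step using this library (not the production service itself; the prefix layout and function names below are assumptions for illustration), the stored onboarding parameters are enough to enumerate and download the export blobs:

```python
def export_blob_prefix(directory_name, export_name, date_range):
    # Cost Management writes each run under
    # <directory>/<export-name>/<date-range>/<file>.csv
    return f"{directory_name}/{export_name}/{date_range}"

def copy_export_files(account_name, access_key, container_name, prefix):
    # Authenticate to Blob Storage with the storage account access key.
    from azure.storage.blob import BlobServiceClient
    service = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net/",
        credential=access_key,
    )
    container = service.get_container_client(container_name)
    # Enumerate only the blobs under the export's prefix and pull each one.
    for blob in container.list_blobs(name_starts_with=prefix):
        data = container.download_blob(blob.name).readall()
        # ...upload `data` to the Yotascale S3 bucket (e.g. with boto3)...
```

The same routine runs once for the Actual cost export and once for the Amortized cost export, since each has its own export name.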


The endpoints it calls are of the form:
https://<my-storage-account-name>.blob.core.windows.net/

 

The IP address ranges from which we make these API calls (the microservice that copies
the CSV files from Azure to Yotascale) are the same as above:
● 172.31.0.0/16 is our VPC CIDR, which hosts all our infrastructure/microservices.
● Our web application is hosted via CloudFront, in addition to microservices hosted within the
above VPC. IP ranges for CloudFront can be found here:
https://d7uri8nf7uskq.cloudfront.net/tools/list-cloudfront-ips


Alternative way to do the one-time UI onboarding


For the one-time onboarding, where we save the information about the storage account, access
key, container, directory, and export name, we could alternatively use a form process where the
customer pastes the values.


We can provide such a form as an alternative if necessary to capture these parameters. The
customer would paste the values shown in the template below into a web form in the Yotascale
dashboard.


"ExportActualCosts": {
    "storage_account_name": "",
    "storage_access_key": "",
    "container_name": "",
    "directory_name": "",
    "cost_export_name": ""
}
"ExportAmortizedCosts": {
    "storage_account_name": "",
    "storage_access_key": "",
    "container_name": "",
    "directory_name": "",
    "cost_export_name": "****"
}
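If the form route is taken, a minimal server-side check could reject incomplete submissions before the values are saved. This sketch is illustrative, not an existing Yotascale component; it assumes form fields named after the stored onboarding parameters:

```python
# Required onboarding fields for each export (Actual and Amortized).
REQUIRED_FIELDS = (
    "storage_account_name",
    "storage_access_key",
    "container_name",
    "directory_name",
    "cost_export_name",
)

def missing_fields(export_config):
    # Return the required onboarding fields that are absent or blank.
    return [f for f in REQUIRED_FIELDS
            if not export_config.get(f, "").strip()]
```

Running this once per pasted block (ExportActualCosts and ExportAmortizedCosts) catches a missing access key or export name before the daily copy service first attempts to run.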