Commit 59ebd23
Updating to cover multiple tasks on 1 server
1 parent a160f5e commit 59ebd23

1 file changed

Lines changed: 40 additions & 2 deletions

File tree

docs/setup.md

### Step 5 – Execute the script & update the scheduled task

1. Run the InvokeMasterScript.ps1 now that the variables have been updated.
2. Once the script completes, open Task Scheduler, right-click the newly created tasks named UsageDataUpload1_\<CloudName\> & OperationalDataUpload1_\<CloudName\>, click Properties, then click "Change User or Group" and set the Run As account to the Admin UserName and Password of the UploadToOMSVM VM.
3. Click Run. Operational Data will now be pushed every 13 minutes. Usage Data will be pushed at 9 AM every day.

The script sets up 2 scheduled tasks:

1. Upload of 1-day worth of usage data provided from the Provider Usage API at 9 AM every day.
2. Upload of operational data every 13 minutes.

The data are uploaded to the OMS workspace you specified in the ARM template.
Note: For usage data, the script is set up to query and upload usage data reported from the day before yesterday each time the scheduled task runs. No usage data will be uploaded if there was no tenant usage during that timeframe.
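The query window the note describes can be sketched as follows. This is illustrative only; the variable names are not taken from InvokeMasterScript.ps1.

```powershell
# Sketch of the usage window: each run covers the full UTC day
# starting two days ago and ending at the start of yesterday.
$today     = (Get-Date).ToUniversalTime().Date
$startTime = $today.AddDays(-2)   # 00:00 UTC, day before yesterday
$endTime   = $today.AddDays(-1)   # 00:00 UTC, yesterday (exclusive end)
"Querying usage from {0:s} to {1:s}" -f $startTime, $endTime
```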
## Deploy additional Data Collection scheduled tasks to a data collection VM
### Step 1 – Get required variables

The following are required to set up the environment. You should gather these variables before proceeding to the next step.

#### DeploymentGUID = "<e.g. 41da4fdd-0e5f-4ecb-85d2-52cb85cd1fca>"

1. Access the privileged endpoint.
2. Run Get-AzureStackStampInformation.
3. Find and copy the DeploymentGUID from the output.
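The three steps above can be sketched from an operator workstation. The ERCS address and credential below are placeholders, and the exact output property name may vary by Azure Stack build, so verify against your stamp's output.

```powershell
# Hypothetical ERCS VM address and cloud-admin credential -- replace with your own.
$ercs = "10.0.0.1"
$cred = Get-Credential -Message "Azure Stack CloudAdmin credential"

# Open a session to the privileged endpoint and run Get-AzureStackStampInformation.
$session = New-PSSession -ComputerName $ercs -ConfigurationName PrivilegedEndpoint -Credential $cred
$stamp   = Invoke-Command -Session $session -ScriptBlock { Get-AzureStackStampInformation }
Remove-PSSession $session

# The deployment GUID is in the returned object; on recent builds the
# property is named DeploymentID.
$DeploymentGUID = $stamp.DeploymentID
```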
#### azureStackAdminUsername = "<e.g. Serviceadmin@myazurestackinstance.onmicrosoft.com>"

1. Update with the Azure Stack Service Admin account email.

#### azureStackAdminPassword = "<e.g. MyAzureStackPassword206!>"

1. Update with the Azure Stack Service Admin account password.

#### CloudName = "<e.g. Orlando MTC>"

1. Update with the name of your cloud; this is how most data will pivot in the views. It must be unique among any Azure Stack tasks already running on the system.

#### Region = "<e.g. Orlando>"

1. Update with the region name used when deploying Azure Stack.

#### Fqdn = "<e.g. azurestack.corp.microsoft.com>"

1. Update with the FQDN used when deploying Azure Stack.

#### OMSWorkpsaceID = "<ID of your log analytics workspace>"

1. Update with the OMS/Log Analytics Workspace ID, which can be found in the Advanced Settings pane of your Log Analytics workspace.

#### OMSSharedKey = "<Log Analytics Workspace Shared Key>"

1. Update with the OMS/Log Analytics Workspace Primary Key, found in the Advanced Settings pane of your Log Analytics workspace.

#### OEM = "<replace with your hardware vendor name>"

1. Update with the name of your hardware vendor. This allows reports in Log Analytics to pivot on the OEM name.
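Collected together, the assignments in C:\InvokeMasterScript.ps1 might look like the following. All values are the placeholder examples from above, not working credentials, and the exact variable syntax in the script may differ.

```powershell
# Placeholder values -- replace each with the data gathered in Step 1.
$DeploymentGUID          = "41da4fdd-0e5f-4ecb-85d2-52cb85cd1fca"
$azureStackAdminUsername = "Serviceadmin@myazurestackinstance.onmicrosoft.com"
$azureStackAdminPassword = "MyAzureStackPassword206!"
$CloudName               = "Orlando MTC"   # must be unique per set of tasks
$Region                  = "Orlando"
$Fqdn                    = "azurestack.corp.microsoft.com"
$OMSWorkpsaceID          = "<ID of your log analytics workspace>"
$OMSSharedKey            = "<Log Analytics Workspace Shared Key>"
$OEM                     = "<your hardware vendor name>"
```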
### Step 2 – Update variables

1. Open an elevated PowerShell ISE session.
2. Open the file C:\InvokeMasterScript.ps1.
3. Update the variables using the data gathered in Step 1.
### Step 3 – Execute the script & update the scheduled task

1. Run the InvokeMasterScript.ps1 now that the variables have been updated.
2. Once the script completes, open Task Scheduler, right-click the newly created tasks named UsageDataUpload1_\<CloudName\> & OperationalDataUpload1_\<CloudName\>, click Properties, then click "Change User or Group" and set the Run As account to the Admin UserName and Password of the UploadToOMSVM VM.
3. Click Run. Operational Data will now be pushed every 13 minutes. Usage Data will be pushed at 9 AM every day.
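The tasks can also be inspected and started from PowerShell instead of the Task Scheduler UI. This sketch uses the built-in ScheduledTasks module and assumes CloudName was set to "Orlando MTC"; substitute your own cloud name in the task names.

```powershell
# List the tasks created by the script (wildcard on the <CloudName> suffix).
Get-ScheduledTask -TaskName "UsageDataUpload1_*", "OperationalDataUpload1_*" |
    Format-Table TaskName, State

# Kick off a first run of each task, equivalent to clicking Run in the UI.
Start-ScheduledTask -TaskName "OperationalDataUpload1_Orlando MTC"
Start-ScheduledTask -TaskName "UsageDataUpload1_Orlando MTC"
```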
The script sets up 2 scheduled tasks:

1. Upload of 1-day worth of usage data provided from the Provider Usage API at 9 AM every day.
2. Upload of operational data every 13 minutes.
## Troubleshooting

1. To verify that data is getting uploaded to the OMS environment:

For specific documentation on the Power BI dashboard template, refer to the [das

## Limitations

1. As of 8/9/2017, there is an 8 MB size limit on the request that Power BI uses to fetch data from OMS. This restriction is planned to be lifted in the future by the OMS Log Analytics team.
2. If the restriction persists, one way to increase the number of days of usage data available (by 24x) is to pull in daily aggregated usage data from OMS instead of hourly aggregated data. The obvious drawback of this method is that one will not be able to drill down to the hourly level on the Power BI dashboard.
