### Step 5 – Execute the script & update the scheduled task
1. Run the InvokeMasterScript.ps1 now that the variables have been updated.
2. Once the script completes, open Task Scheduler, right-click each of the newly created tasks, UsageDataUpload1_&lt;CloudName&gt; and OperationalDataUpload1_&lt;CloudName&gt;, click Properties, and use "Change User or Group" to set the Run As account to the admin username and password of the UploadToOMSVM VM.
3. Click Run. Operational data will now be pushed every 13 minutes, and usage data will be pushed at 9 AM every day.
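If you prefer to make these changes from the command line instead of the Task Scheduler UI, a sketch along the following lines may work. The task names assume the `_<CloudName>` suffix described above, and the credential prompt stands in for the UploadToOMSVM VM's admin account:

```powershell
# Sketch only: update the Run As account for both upload tasks from an
# elevated PowerShell session, then start a first run of each task
# (the CLI equivalent of clicking Run in Task Scheduler).
$cred = Get-Credential -Message "UploadToOMSVM admin account"
$user = $cred.UserName
$pass = $cred.GetNetworkCredential().Password

schtasks /Change /TN "UsageDataUpload1_<CloudName>" /RU $user /RP $pass
schtasks /Change /TN "OperationalDataUpload1_<CloudName>" /RU $user /RP $pass

schtasks /Run /TN "UsageDataUpload1_<CloudName>"
schtasks /Run /TN "OperationalDataUpload1_<CloudName>"
```

Replace `<CloudName>` with the cloud name you chose before running the commands.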
The script sets up two scheduled tasks:

1. Upload of one day's worth of usage data from the Provider Usage API at 9 AM every day.
2. Upload of operational data every 13 minutes.
The data are uploaded to the OMS workspace you specified in the ARM template.
Note: Each time the scheduled task runs, the script queries and uploads usage data reported for the day before yesterday. No usage data will be uploaded if there was no tenant usage during that timeframe.
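To make the "day before yesterday" window concrete, here is a minimal sketch of how such a query window could be computed. The variable names are illustrative and not taken from the actual script:

```powershell
# Sketch: compute the one-day usage reporting window ending two days ago.
# If the task runs on 2017-08-11, this selects 2017-08-09 00:00 to 2017-08-10 00:00.
$today       = (Get-Date).Date        # midnight at the start of today
$windowStart = $today.AddDays(-2)     # day before yesterday, 00:00
$windowEnd   = $today.AddDays(-1)     # yesterday, 00:00 (exclusive end)

Write-Output "Querying usage from $windowStart to $windowEnd"
```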
## Deploy additional Data Collection scheduled tasks to a data collection VM
### Step 1 - Get required variables
The following are required to set up the environment. You should gather these variables before proceeding to the next step.
1. Update with the Azure Stack Service Admin account password
#### CloudName = "&lt;e.g. Orlando MTC&gt;"
1. Update with the name of your cloud; this is how most data will pivot in the views. The value must be unique among any Azure Stack tasks already running on the system.
#### Region = "<e.g. Orlando>"
1. Update with the region name used when deploying Azure Stack
1. Update with the OMS/Log Analytics Workspace Primary Key found in the Advanced Settings pane of your Log Analytics workspace
#### OEM = "<replacewithyourhardwarevendorname>"
1. Update with the name of your hardware vendor. This allows reports in Log Analytics to use the OEM name.
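Taken together, the variable block at the top of C:\InvokeMasterScript.ps1 will end up looking roughly like the following. This is a sketch only: the exact variable names in the script may differ, and every value shown is a placeholder:

```powershell
# Illustrative values only - substitute your own environment's details.
$CloudName    = "Orlando MTC"   # must be unique among Azure Stack tasks on this system
$Region       = "Orlando"       # region name used when deploying Azure Stack
$WorkspaceKey = "<primary key from the Log Analytics Advanced Settings pane>"
$OEM          = "<your hardware vendor name>"
```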
### Step 2 – Update variables
1. Open an elevated PowerShell ISE session
2. Open the file C:\InvokeMasterScript.ps1
3. Update the variables using the data gathered in Step 1
### Step 3 – Execute the script & update the scheduled task
1. Run the InvokeMasterScript.ps1 now that the variables have been updated.
2. Once the script completes, open Task Scheduler, right-click each of the newly created tasks, UsageDataUpload1_&lt;CloudName&gt; and OperationalDataUpload1_&lt;CloudName&gt;, click Properties, and use "Change User or Group" to set the Run As account to the admin username and password of the UploadToOMSVM VM.
3. Click Run. Operational data will now be pushed every 13 minutes, and usage data will be pushed at 9 AM every day.
The script sets up two scheduled tasks:
1. Upload of one day's worth of usage data from the Provider Usage API at 9 AM every day.
2. Upload of operational data every 13 minutes.
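Once both tasks exist, you can confirm they are registered and check when they last ran without opening the Task Scheduler UI. A sketch, assuming the task names created above:

```powershell
# Sketch: list both upload tasks and their last/next run times.
Get-ScheduledTask -TaskName "*DataUpload1_*" |
    ForEach-Object {
        $info = $_ | Get-ScheduledTaskInfo
        [pscustomobject]@{
            Task       = $_.TaskName
            State      = $_.State
            LastRun    = $info.LastRunTime
            LastResult = $info.LastTaskResult   # 0 means the last run succeeded
            NextRun    = $info.NextRunTime
        }
    }
```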
## Troubleshooting
1. To verify that data is being uploaded to the OMS environment:
## Limitations
1. As of 8/9/2017, there is an 8 MB size limit on the request that Power BI uses to fetch data from OMS. The OMS Log Analytics team plans to lift this restriction in the future.
2. If the restriction persists, one way to increase the number of days of usage data available (by 24x) is to pull daily aggregated usage data from OMS instead of hourly aggregated data. The obvious drawback is that you will not be able to drill down to the hourly level on the Power BI dashboard.