To list all of the options for a specific command, run:

Serverless plugin:

```bash
sls dynamodt <command> --help
```
## What happens behind the scenes
- When a data transformation runs for the first time, a tracking record is created in your table. This record tracks which transformations have already been executed on that specific table.
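To make the idea concrete, here is a rough illustration of what such a tracking record could hold and how it would gate re-execution. This is a hypothetical sketch — the record shape, key names, and values are made up for illustration and are not the plugin's actual schema:

```javascript
// Hypothetical illustration (NOT the plugin's actual schema): a tracking
// record might store the latest executed transformation sequence per table.
const trackingRecord = {
  Key: 'dynamodt-transformations-tracking', // made-up partition key value
  tableName: 'users',                       // made-up table name
  latestSequence: 2,                        // last transformation that ran
  executedAt: '2023-01-01T00:00:00.000Z',
};

// A transformation with sequence N would run only if N > latestSequence.
function shouldRun(record, sequence) {
  return sequence > record.latestSequence;
}

console.log(shouldRun(trackingRecord, 3)); // true: v3 has not run yet
console.log(shouldRun(trackingRecord, 2)); // false: v2 already executed
```

This is why re-running a deploy does not re-apply transformations that already completed.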
## The safe data transformation process
The next section describes what the data transformation process looks like and the order of each step.

### Steps
#### 1st Phase (Add New Resources)
1. Update the serverless.yml resources (if needed). \
Reminder: we are not overriding existing data, only creating new data. [See some examples](#examples)
1. Your new code should be able to write to both your old and new resources. This ensures that we can roll back to the previous state and prevents possible data gaps.
1. Create a pull request and deploy it to every stage in your application.
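As an example of adding rather than overriding, a serverless.yml change for this phase might add a new global secondary index next to the existing key schema. This is a minimal sketch — the table, attribute, and index names are made up, and only the relevant properties are shown:

```yaml
# Hypothetical sketch: add a NEW global secondary index to an existing table
# instead of changing its existing keys (all names here are made up).
resources:
  Resources:
    UsersTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: users
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
          - AttributeName: email        # new attribute used by the new index
            AttributeType: S
        KeySchema:
          - AttributeName: id           # existing key schema stays untouched
            KeyType: HASH
        BillingMode: PAY_PER_REQUEST
        GlobalSecondaryIndexes:
          - IndexName: emailIndex       # the new, additive resource
            KeySchema:
              - AttributeName: email
                KeyType: HASH
            Projection:
              ProjectionType: ALL
```

Because the old key schema is untouched, code deployed before this change keeps working unchanged.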
#### 2nd Phase (Data Transformation)
1. The first time, run `sls dynamodt init`. It will generate a folder per table inside the root folder of your service (the name of the folder is the exact name of the table). \
A template data transformation file (v1.js) will be created in each table folder. \
Implement these functions:
   1. `transformUp` - transform all of the table items to the new shape (use preparationData if needed).
   1. `transformDown` - transform all of the table items to the previous shape.
   1. `prepare` - use this function whenever your data transformation relies on data from external resources.
1. Export these functions, and export the version of the current data transformation (set the sequence variable value; it should match the number in the file name).
1. Preparing data from external resources for the data transformation can be done by using `sls dynamodt prepare`. \
Run `sls dynamodt prepare --tNumber <transformation_number> --table <table>` \
The data will be stored in an S3 bucket. \
The data will be decrypted while running the data transformation script.
1. **Final Step**: Create a pull request. \
Note that the data transformation runs after an `sls deploy` command; it is integrated \
with the lifecycle of the serverless `after:deploy:deploy` hook.
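Putting the steps above together, a transformation file might look roughly like this. This is a minimal sketch under assumptions: the item shapes, the per-item function signatures, and the `fullName` attribute are all made up, and the actual template generated by `sls dynamodt init` may differ:

```javascript
// v1.js - hypothetical sketch of a data transformation module (the real
// template from `sls dynamodt init` may use different signatures).

// Transform one item to the new shape, e.g. derive a new attribute.
// preparationData would come from the prepare step, if one was needed.
function transformUp(item, preparationData) {
  return { ...item, fullName: `${item.firstName} ${item.lastName}` };
}

// Revert one item to the previous shape (drop the derived attribute).
function transformDown(item) {
  const { fullName, ...previousShape } = item;
  return previousShape;
}

// Fetch data from external resources when the transformation needs it.
async function prepare() {
  return {}; // nothing external needed in this example
}

// Must match the number in the file name (v1.js -> 1).
const sequence = 1;

module.exports = { transformUp, transformDown, prepare, sequence };
```

Keeping `transformDown` an exact inverse of `transformUp` is what makes the rollback path in the process safe.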
#### 3rd Phase (Use The New Resources/Data)
1. Adjust your code to work with the new data. \
For example, read from the new index instead of the old one.
1. Create a pull request with the updated lambdas.
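Switching reads to a new index can be as small as changing the `IndexName` in the query parameters. A sketch with hypothetical table and index names, using the DynamoDB DocumentClient parameter shape:

```javascript
// Hypothetical: the 1st phase added an 'emailIndex' GSI; this phase switches
// reads over to it (table and index names are made up).
function buildEmailQuery(email) {
  return {
    TableName: 'users',
    IndexName: 'emailIndex', // read from the NEW index instead of the old one
    KeyConditionExpression: 'email = :email',
    ExpressionAttributeValues: { ':email': email },
  };
}

// The resulting params object is what you would pass to DocumentClient.query().
console.log(buildEmailQuery('ada@example.com').IndexName); // emailIndex
```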
136
-
137
-
138
-
#### 4th Phase (Cleanup)
1. Clean up the unused data (attributes/indexes/etc.).
### Key Concepts
First of all, keep in mind that our mission is to prevent downtime while executing data transformations.
- Don't override resources/data.
- Your code should be able to work with the old version of the data and keep it updated.
- Prefer multiple simple data transformations over one complex transformation.
### Data Transformation Script Format (e.g. v1_script.js)
0 commit comments