
Commit 3c423ad

Update ReadMe file
#5
1 parent 75d4dde commit 3c423ad

2 files changed

Lines changed: 268 additions & 16 deletions


README.md

Lines changed: 133 additions & 7 deletions
@@ -1,4 +1,4 @@
-# FlowSynx CSV Plugin – FlowSynx Platform Integration
+# FlowSynx CSV Plugin
 
 The CSV Plugin is a pre-packaged, plug-and-play integration component for the FlowSynx engine. It enables reading from and writing to CSV files with configurable parameters such as file path, delimiter, headers, and encoding. Designed for FlowSynx’s no-code/low-code automation workflows, this plugin simplifies data extraction and transformation tasks.
 
@@ -8,16 +8,142 @@ This plugin is automatically installed by the FlowSynx engine when selected with
 
 ## Purpose
 
-This plugin allows FlowSynx users to interact with CSV data within workflows without writing code. Once installed, it appears as a connector in the platform’s workflow builder, enabling seamless integration of CSV-based data operations such as importing, exporting, and transformation into automated processes.
+The CSV Plugin allows FlowSynx users to:
+
+- Parse and inspect CSV structures.
+- Map CSV data to specific output fields.
+- Filter CSV data based on conditions.
+- Transform CSV data inline for downstream workflows.
+
+---
+
+## Supported Operations
+
+- **filter**: Filters rows in the CSV using the conditions defined in `Filters`. Supports logical grouping (`and`, `or`) and common operators such as `equals`, `contains`, `startsWith`, `endsWith`, `greaterThan`, and `lessThan`.
+- **map**: Maps existing fields in the CSV to a new subset of keys or column arrangement for simplified output.
+
+---
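The operator and grouping semantics listed above can be sketched in plain Python. This is an illustrative reading of the documented behavior, not the plugin's actual implementation; the function names are invented, and numeric coercion for `greaterThan`/`lessThan` is an assumption:

```python
# Hypothetical sketch of the documented filter semantics; not the
# plugin's code. Whether greaterThan/lessThan compare numerically
# or lexically is an assumption (numeric here).

def matches(value: str, operator: str, target: str) -> bool:
    """Evaluate one condition against a single cell value."""
    if operator == "equals":
        return value == target
    if operator == "contains":
        return target in value
    if operator == "startsWith":
        return value.startswith(target)
    if operator == "endsWith":
        return value.endswith(target)
    if operator == "greaterThan":
        return float(value) > float(target)
    if operator == "lessThan":
        return float(value) < float(target)
    raise ValueError(f"unknown operator: {operator}")

def evaluate(row: dict, group: dict) -> bool:
    """Combine a group's conditions with its 'and'/'or' logic."""
    results = [matches(row[c["Column"]], c["Operator"], c["Value"])
               for c in group["Filters"]]
    return all(results) if group["Logic"] == "and" else any(results)
```

With `Logic` set to `and`, every condition must hold for a row to pass; with `or`, any one suffices.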
+
+## Input Parameters
+
+The plugin accepts the following parameters:
+
+- `Operation` (string): **Required.** The operation to perform. Supported values are `filter` and `map`.
+- `Data` (string/object): **Required.** The raw CSV content to process.
+- `Delimiter` (string): Optional. Defaults to `,`. The character used to separate fields in the CSV.
+- `Mappings` (list): **Required for the `map` operation.** Defines which fields to include in the output.
+- `IgnoreBlankLines` (bool): Optional. Specifies whether blank lines in the CSV should be ignored (`true`) or treated as data rows (`false`). Defaults to `true`.
+- `Filters` (object): Optional. Used with the `filter` operation to define the filtering criteria.
+
+### Example input
+
+```json
+{
+  "Operation": "map",
+  "Data": { ... },
+  "Mappings": ["LastName", "Email"],
+  "IgnoreBlankLines": true,
+  "Delimiter": ","
+}
+```
+
+---
+
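The parameter rules above can be checked mechanically before a payload is submitted. A minimal, hypothetical validator sketch (`validate_params` is an invented helper, not part of the plugin):

```python
# Hypothetical pre-flight check mirroring the documented parameter
# rules; not part of the plugin itself.

def validate_params(params: dict) -> list[str]:
    """Return a list of problems with a parameter payload (empty = valid)."""
    errors = []
    op = params.get("Operation")
    if op not in ("filter", "map"):
        errors.append("Operation must be 'filter' or 'map'")
    if "Data" not in params:
        errors.append("Data is required")
    if op == "map" and not params.get("Mappings"):
        errors.append("Mappings is required for the map operation")
    if not isinstance(params.get("IgnoreBlankLines", True), bool):
        errors.append("IgnoreBlankLines must be a boolean")
    return errors
```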
+## Operation Examples
+
+### map Operation
+
+**Input Data:**
+```csv
+CustomerID,FirstName,LastName,Email,Phone,Country
+1,John,Doe,john.doe@example.com,1234,USA
+2,Jane,Smith,jane.smith@example.com,201234,UK
+3,Raj,Patel,raj.patel@example.com,98765,India
+4,Anna,Schmidt,anna.schmidt@example.com,30234,Germany
+5,Maria,Gonzalez,maria.gonzalez@example.com,911234,Spain
+```
+
+**Input Parameters:**
+```json
+{
+  "Operation": "map",
+  "Data": { ... },
+  "Mappings": ["LastName", "Email"],
+  "IgnoreBlankLines": true,
+  "Delimiter": ","
+}
+```
+
+**Output:**
+```csv
+LastName,Email
+Doe,john.doe@example.com
+Smith,jane.smith@example.com
+Patel,raj.patel@example.com
+Schmidt,anna.schmidt@example.com
+Gonzalez,maria.gonzalez@example.com
+```
+
+---
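Outside FlowSynx, the map operation shown above amounts to projecting a subset of columns. A standard-library Python sketch of the same behavior (illustrative only; `map_columns` is an invented name):

```python
import csv
import io

def map_columns(data: str, mappings: list[str], delimiter: str = ",",
                ignore_blank_lines: bool = True) -> str:
    """Keep only the columns named in `mappings`, in the given order."""
    lines = data.splitlines()
    if ignore_blank_lines:
        lines = [line for line in lines if line.strip()]
    reader = csv.DictReader(lines, delimiter=delimiter)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=mappings, lineterminator="\n")
    writer.writeheader()
    for row in reader:
        writer.writerow({name: row[name] for name in mappings})
    return out.getvalue()
```

Fed the sample data with `["LastName", "Email"]`, this produces the two-column output shown in the example.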
+
+### filter Operation
+
+**Input Data:**
+```csv
+CustomerID,FirstName,LastName,Email,Phone,Country
+1,John,Doe,john.doe@example.com,1234,USA
+2,Jane,Smith,jane.smith@example.com,201234,UK
+3,Raj,Patel,raj.patel@example.com,98765,India
+4,Anna,Schmidt,anna.schmidt@example.com,30234,Germany
+5,Maria,Gonzalez,maria.gonzalez@example.com,911234,Spain
+```
+
+**Input Parameters:**
+```json
+{
+  "Operation": "filter",
+  "Data": { ... },
+  "Filters": {
+    "Logic": "and",
+    "Filters": [
+      {
+        "Column": "Country",
+        "Operator": "equals",
+        "Value": "USA"
+      },
+      {
+        "Column": "FirstName",
+        "Operator": "startsWith",
+        "Value": "J"
+      }
+    ]
+  },
+  "IgnoreBlankLines": true,
+  "Delimiter": ","
+}
+```
+
+**Output:**
+```csv
+CustomerID,FirstName,LastName,Email,Phone,Country
+1,John,Doe,john.doe@example.com,1234,USA
+```
+
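The filter example above can be reproduced with Python's standard library alone. The sketch below hard-codes the two example conditions rather than interpreting a `Filters` object (illustrative, not the plugin's code):

```python
import csv
import io

def filter_example(data: str, delimiter: str = ",") -> str:
    """Keep rows where Country == 'USA' AND FirstName starts with 'J'."""
    reader = csv.DictReader(io.StringIO(data), delimiter=delimiter)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames,
                            lineterminator="\n")
    writer.writeheader()
    for row in reader:
        # Both conditions must hold, matching "Logic": "and".
        if row["Country"] == "USA" and row["FirstName"].startswith("J"):
            writer.writerow(row)
    return out.getvalue()
```

On the sample data, only the John Doe row satisfies both conditions, matching the expected output above.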
+## Debugging Tips
+
+- Ensure that the `Delimiter` matches the file’s actual separator (`,` for standard CSV; `;` or `\t` for other variants).
+- Validate that `Mappings` and `Filters` reference columns that actually exist in the CSV header row.
+- If unexpected rows are excluded or included during filtering, check the logical operators (`and` / `or`) and ensure that data types align (e.g., string comparisons for string fields).
+- To troubleshoot encoding issues, verify that the CSV input is UTF-8, or specify the encoding explicitly if FlowSynx supports it.
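For the first tip, the delimiter of an unfamiliar file can be guessed with Python's standard-library `csv.Sniffer` before setting the `Delimiter` parameter (a generic technique, independent of FlowSynx):

```python
import csv

def detect_delimiter(sample: str) -> str:
    """Guess the field separator from a sample of the CSV text."""
    dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
    return dialect.delimiter
```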
 
 ---
 
-## Notes
+## Security Notes
 
-- This plugin is exclusively supported on the FlowSynx platform.
-- It is installed automatically by the FlowSynx engine.
-- All operational logic is securely managed and executed by FlowSynx.
-- File access paths and related settings are configured via platform settings or workflow parameters.
+- No data is persisted unless explicitly configured.
+- All operations run in a secure sandbox within FlowSynx.
+- Only authorized platform users can view or modify configurations.
 
 ---
 

src/README.md

Lines changed: 135 additions & 9 deletions
@@ -1,26 +1,152 @@
-## FlowSynx CSV Plugin – FlowSynx Platform Integration
+## FlowSynx CSV Plugin
 
 The CSV Plugin is a pre-packaged, plug-and-play integration component for the FlowSynx engine. It enables reading from and writing to CSV files with configurable parameters such as file path, delimiter, headers, and encoding. Designed for FlowSynx’s no-code/low-code automation workflows, this plugin simplifies data extraction and transformation tasks.
 
 This plugin is automatically installed by the FlowSynx engine when selected within the platform. It is not intended for manual installation or standalone developer use outside the FlowSynx environment.
 
 ---
 
(The changes between here and the License section are identical to the README.md changes shown above.)
 ## License
 
-© FlowSynx. All rights reserved.
+Copyright FlowSynx. All rights reserved.
